About the project

The course:

The course Introduction to open data science started on Wednesday 30th October 2019.

My feelings:

I’m feeling so excited!!

I became aware of the course a few weeks ago through an announcement on the University of Eastern Finland’s Yammer platform. After checking the course material on the MOOC platform I can see great new things coming up for me to learn. So I’m very excited about it.
I’m looking forward to all the new things I will learn in this course, and especially to all the challenges coming up along the way.
I hope I can manage all the assignments.

I’m expecting to learn a lot about data handling, improve my R skills and hopefully be able to use this knowledge later on in my work.

Chapter 2: Regression and model validation

Data wrangling and regression analysis

Work of week 45 (4.11. - 10.11.2019)


1. Data wrangling

4.11.2019: Started to work through the DataCamp exercises offered by the IODS course and was able to finish

  • R Short and Sweet
  • R Helsinki Open Data Science courses

5.11.2019: Started to work on the R script for data wrangling and regression analysis

  • Wrote the script for data wrangling and was able to finish that part (all tests on the script worked)

Now follows a description of my work progress.

1.1. Read the dataframe

Script for reading the data (reading the dataframe from the website)
# Read the data file ----
learning2014 <- read.table(file = 
                "https://www.mv.helsinki.fi/home/kvehkala/JYTmooc/JYTOPKYS3-data.txt", 
                sep = "\t", header = TRUE)                          
                # read the data file and assign it to the object "learning2014"

# Explore the read data ----

head(learning2014) # see the top of the dataset with the first 6 observations
str(learning2014) # check the structure of the dataset --> variable names, types and example values
dim(learning2014) # check the table dimensions --> (observations = rows and variables = columns)

The dataset consists of 183 observations (rows) and 60 variables (columns).

If you want to check the data with some diagrams as well, you can use for example the following plots:

# Plot some of the data as a scatter plot or a histogram (here the variables Attitude and Points)
library(ggplot2)
qplot(Attitude, Points, data = learning2014)
hist(learning2014$Points)

What follows now is the data wrangling. We have to combine the questions belonging to the deep, surface and strategic learning approaches and calculate the mean value of each combined variable. The other variables (gender, Attitude, Points) stay as they are; only the column names are changed.

1.2. Perform the data wrangling

Script for data wrangling

I decided to put the wrangled data into a new R object (a dataframe) which I named “lrn14_analysis”. This dataframe will later be used for the data analyses.
Here I used a more complicated way to do the data wrangling; in the meeting on Wednesday, 6.11., a way with the pipe operator %>% was presented.

# Create an analysis dataset: "lrn14_analysis" ----

# Create an analysis dataset with the variables gender, age, attitude, deep, stra, surf and points 
# (by combining questions in the learning2014 data)

library(dplyr) # load the package for data wrangling

keep_columns <- c("gender","Age","Attitude","Points") # these are the data columns which need to be kept
lrn14_analysis <- select(learning2014, one_of(keep_columns))  # assign a new object and select the kept columns
colnames(lrn14_analysis) <- c("gender", "age", "attitude", "points") # change of the kept column names

# define questions (observations from variables) acc. instructions
deep_q <- c("D03", "D11", "D19", "D27", "D07", "D14", "D22", "D30","D06",  "D15", "D23", "D31")  # deep questions
surf_q <- c("SU02","SU10","SU18","SU26", "SU05","SU13","SU21","SU29","SU08","SU16","SU24","SU32") # surface questions
stra_q <- c("ST01","ST09","ST17","ST25","ST04","ST12","ST20","ST28") # strategic questions

# Select the combined variables (columns) & scale the observations (mean) and add it to the analysis dataframe
deep <- select(learning2014, one_of(deep_q))
lrn14_analysis$deep <- round(rowMeans(deep, na.rm = TRUE), digits = 2) # values are rounded to 2 digits

surf <- select(learning2014, one_of(surf_q))
lrn14_analysis$surf <- round(rowMeans(surf,na.rm = TRUE), digits = 2)

stra <- select(learning2014, one_of(stra_q))
lrn14_analysis$stra <- round(rowMeans(stra, na.rm = TRUE), digits = 2)

# divide the attitude column by 10 (it's the sum of 10 questions)
lrn14_analysis$attitude <- lrn14_analysis$attitude / 10

# Exclude observations where the exam points variable is zero. 
lrn14_analysis <- filter(lrn14_analysis, points > 0)
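As mentioned above, the same wrangling can also be written with the pipe operator %>%. Here is a minimal sketch of that idea (not the exact code presented in the meeting); it reuses the learning2014 data and the deep_q, surf_q and stra_q question vectors from the script above:

```r
# Hypothetical pipe version of the wrangling above; assumes learning2014,
# deep_q, surf_q and stra_q already exist in the workspace
library(dplyr)

lrn14_pipe <- learning2014 %>%
  mutate(deep = round(rowMeans(.[deep_q], na.rm = TRUE), 2),  # mean of deep questions
         surf = round(rowMeans(.[surf_q], na.rm = TRUE), 2),  # mean of surface questions
         stra = round(rowMeans(.[stra_q], na.rm = TRUE), 2),  # mean of strategic questions
         Attitude = Attitude / 10) %>%                        # rescale attitude to 1-5
  select(gender, age = Age, attitude = Attitude,              # keep and rename columns
         points = Points, deep, surf, stra) %>%
  filter(points > 0)                                          # drop zero-point rows
```

The result should match the lrn14_analysis dataframe built step by step above.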

The new dataframe was created. Now the structure is checked.

# Check the analysis dataset
str(lrn14_analysis)
## 'data.frame':    166 obs. of  7 variables:
##  $ gender  : Factor w/ 2 levels "F","M": 1 2 1 2 2 1 2 1 2 1 ...
##  $ age     : int  53 55 49 53 49 38 50 37 37 42 ...
##  $ attitude: num  3.7 3.1 2.5 3.5 3.7 3.8 3.5 2.9 3.8 2.1 ...
##  $ points  : int  25 12 24 10 22 21 21 31 24 26 ...
##  $ deep    : num  3.58 2.92 3.5 3.5 3.67 4.75 3.83 3.25 4.33 4 ...
##  $ surf    : num  2.58 3.17 2.25 2.25 2.83 2.42 1.92 2.83 2.17 3 ...
##  $ stra    : num  3.38 2.75 3.62 3.12 3.62 3.62 2.25 4 4.25 3.5 ...
dim(lrn14_analysis)
## [1] 166   7

The data consists now of 7 variables (columns) and 166 observations. The object name is “lrn14_analysis”.


1.3. Save the updated dataframe as a .txt or .csv table

First the working directory is set and then I save the dataframe with the write.table() and write.csv() functions.

# Set the working directory to IODS project folder ---- 
setwd("~/IODS-project") # set the wd to the IODS folder
# save the analysis dataset to the "data" folder ----

write.table(lrn14_analysis, 
      file = "C:/Users/richla/OneDrive/1 C - R-Folder/11-IODS-course/IODS-project/data/lrn14_analysis_table.txt",
      sep = "\t", col.names = TRUE, row.names = TRUE)

write.csv(lrn14_analysis, 
      file = "C:/Users/richla/OneDrive/1 C - R-Folder/11-IODS-course/IODS-project/data/lrn14_analysis_table.csv",
      row.names = FALSE)

Afterwards I check if the tables can be read by R using the read.table() and read.csv() functions, piping the result with the %>% operator to the head() function to show the first 6 observations of the dataframes.

# check if the table can be read ----
read.table(file = 
             "C:/Users/richla/OneDrive/1 C - R-Folder/11-IODS-course/IODS-project/data/lrn14_analysis_table.txt") %>% head()
##   gender age attitude points deep surf stra
## 1      F  53      3.7     25 3.58 2.58 3.38
## 2      M  55      3.1     12 2.92 3.17 2.75
## 3      F  49      2.5     24 3.50 2.25 3.62
## 4      M  53      3.5     10 3.50 2.25 3.12
## 5      M  49      3.7     22 3.67 2.83 3.62
## 6      F  38      3.8     21 4.75 2.42 3.62
read.csv(file = 
           "C:/Users/richla/OneDrive/1 C - R-Folder/11-IODS-course/IODS-project/data/lrn14_analysis_table.csv") %>% head()
##   gender age attitude points deep surf stra
## 1      F  53      3.7     25 3.58 2.58 3.38
## 2      M  55      3.1     12 2.92 3.17 2.75
## 3      F  49      2.5     24 3.50 2.25 3.62
## 4      M  53      3.5     10 3.50 2.25 3.12
## 5      M  49      3.7     22 3.67 2.83 3.62
## 6      F  38      3.8     21 4.75 2.42 3.62

2. Analysis

The work on the analysis script and documentation started on 6.11.2019.

2.1. Reading the data

I read the dataset table from my data folder and checked the dataframe structure and dimensions.

# Set the working directory
setwd("~/IODS-project/data") # set work directory
# Read the data file ----
lrn14_analysis <- 
read.table(file = "C:/Users/richla/OneDrive/1 C - R-Folder/11-IODS-course/IODS-project/data/lrn14_analysis_table.txt", stringsAsFactors = TRUE) 

lrn14_analysis %>% str() # read the data table and check the structure
## 'data.frame':    166 obs. of  7 variables:
##  $ gender  : Factor w/ 2 levels "F","M": 1 2 1 2 2 1 2 1 2 1 ...
##  $ age     : int  53 55 49 53 49 38 50 37 37 42 ...
##  $ attitude: num  3.7 3.1 2.5 3.5 3.7 3.8 3.5 2.9 3.8 2.1 ...
##  $ points  : int  25 12 24 10 22 21 21 31 24 26 ...
##  $ deep    : num  3.58 2.92 3.5 3.5 3.67 4.75 3.83 3.25 4.33 4 ...
##  $ surf    : num  2.58 3.17 2.25 2.25 2.83 2.42 1.92 2.83 2.17 3 ...
##  $ stra    : num  3.38 2.75 3.62 3.12 3.62 3.62 2.25 4 4.25 3.5 ...

When I read the data table I set the stringsAsFactors argument (in the read.table() function) to TRUE. This way the observations “F” and “M” in the gender column become factor levels - coded here as 1 & 2. The dataframe includes 166 observations (rows) in 7 variables (columns).
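The factor coding of gender can be illustrated with a small standalone example (made-up values, not the course data):

```r
# factor() stores characters as integer codes, with levels sorted
# alphabetically - which is why "F" shows as 1 and "M" as 2 in str()
g <- factor(c("F", "M", "F", "M", "M"))

levels(g)      # "F" "M"
as.integer(g)  # 1 2 1 2 2
```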

2.2. Graphical overview and data summary

The data we analyse comes from a survey of students regarding their learning approaches. The data includes the students’ gender, age, exam points and their global attitude towards statistics (the sum of 10 questions related to the students’ attitude towards statistics, each measured on a Likert scale (1-5)). The attitude value was divided by 10 in the data wrangling part to bring it back to the 1-5 scale.

For a graphical overview of the dataset I used the ggpairs() function, which produces a matrix of plots and the correlations between the different variables.

library(ggplot2) 
library(GGally) # to show the graph these packages need to be loaded

ov_lrn14_2 <- ggpairs(lrn14_analysis, mapping = aes(col = gender), title = "Graphical overview of lrn14_analysis", 
                      lower = list(combo = wrap("facethist", bins = 20)), 
                      upper = list(continuous = wrap("cor", size = 2.8)))

ov_lrn14_2 # show the graph

The overview plot shows the distribution of all observations of each variable with histograms, scatter plots and density curves. The colours represent the students’ gender: female students are shown in reddish and male students in turquoise.
The upper right part of the overview graph shows the correlations (all observations of one variable correlated with the observations of another variable). These results give a first hint of which variables might give a significant regression result.

# Save the overview plot

ggsave("OV_plot_lrn14.png", 
       plot = ov_lrn14_2, path = "~/IODS-project/data/", scale = 1, dpi = 300) 
# the graph is saved as .png file in my data folder

Additionally a data summary table of the lrn14_analysis dataset was created.

# Summary table of the lrn14_analysis data ----

Sum_table <- summary(lrn14_analysis)
Sum_table
##  gender       age           attitude         points           deep     
##  F:110   Min.   :17.00   Min.   :1.400   Min.   : 7.00   Min.   :1.58  
##  M: 56   1st Qu.:21.00   1st Qu.:2.600   1st Qu.:19.00   1st Qu.:3.33  
##          Median :22.00   Median :3.200   Median :23.00   Median :3.67  
##          Mean   :25.51   Mean   :3.143   Mean   :22.72   Mean   :3.68  
##          3rd Qu.:27.00   3rd Qu.:3.700   3rd Qu.:27.75   3rd Qu.:4.08  
##          Max.   :55.00   Max.   :5.000   Max.   :33.00   Max.   :4.92  
##       surf            stra      
##  Min.   :1.580   Min.   :1.250  
##  1st Qu.:2.420   1st Qu.:2.620  
##  Median :2.830   Median :3.185  
##  Mean   :2.787   Mean   :3.121  
##  3rd Qu.:3.170   3rd Qu.:3.620  
##  Max.   :4.330   Max.   :5.000

2.3. Regression models

2.3.1. Simple regression model of points ~ attitude

I tested different simple regression models with the exam points as the dependent variable (y), using the lm() function.

# Simple regressions ----

# regression with gender variable
gen_lm <- lm(points ~ gender, data = lrn14_analysis)
summary(gen_lm) # not significant

# regression with age variable
age_lm <- lm(points ~ age, data = lrn14_analysis)
summary(age_lm) # not significant

These regression analyses did not give significant results. Next I performed the regression analysis of points ~ attitude, with the following outcome:

# attitude
att_lm <- lm(points ~ attitude, data = lrn14_analysis)
att_lm_res <- summary(att_lm)
att_lm_res
## 
## Call:
## lm(formula = points ~ attitude, data = lrn14_analysis)
## 
## Residuals:
##      Min       1Q   Median       3Q      Max 
## -16.9763  -3.2119   0.4339   4.1534  10.6645 
## 
## Coefficients:
##             Estimate Std. Error t value Pr(>|t|)    
## (Intercept)  11.6372     1.8303   6.358 1.95e-09 ***
## attitude      3.5255     0.5674   6.214 4.12e-09 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 5.32 on 164 degrees of freedom
## Multiple R-squared:  0.1906, Adjusted R-squared:  0.1856 
## F-statistic: 38.61 on 1 and 164 DF,  p-value: 4.119e-09
knitr::kable(att_lm_res$coefficients, digits=3, caption="Regression coefficients point ~ attitude")
Regression coefficients points ~ attitude
            Estimate  Std. Error  t value  Pr(>|t|)
(Intercept)   11.637       1.830    6.358         0
attitude       3.525       0.567    6.214         0

Here are the diagnostic plots of the regression model points ~ attitude:

plot(att_lm, which = c(1,2,5))

Regression model points ~ attitude interpretation:

A significant outcome was shown in the simple regression analysis of points ~ attitude.

The summary of the regression model shows that the estimated increase in points per 1-unit increase in attitude is 3.525. The p-value of 4.12e-09 *** marks this as a significant result (it’s practically 0). So the students’ attitude (towards statistics) has a significant influence on the exam points outcome: depending on the attitude, a student gets a higher or a lower number of points. The intercept shows a value of 11.64 - so even with zero attitude a student would still reach a level of about 11 points in an exam.
The multiple R-squared is the coefficient of determination: it tells how much of the variation in the dependent variable the model explains, with a value of 1 the strongest possible linear relationship and 0 none at all. In a simple regression it equals the squared correlation coefficient. Here the value is 0.1906, so not a very strong relationship, but there is an effect.
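That the multiple R-squared of a simple regression equals the squared correlation coefficient can be checked with a small made-up example:

```r
# In a one-predictor linear model, summary()$r.squared equals cor(x, y)^2
set.seed(1)
x <- 1:20
y <- 2 * x + rnorm(20)  # synthetic data with a known linear trend

fit <- lm(y ~ x)
all.equal(summary(fit)$r.squared, cor(x, y)^2)  # TRUE
```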

The diagnostic plots consist of a "Residuals vs. Fitted" plot, which is used to check the linearity (or non-linearity) of the observations and possible variance problems (observations concentrating in a certain direction); you can also check whether there are outliers in the dataset.
The resulting plot of our points ~ attitude model shows good linearity and constant variance. The observations are nicely distributed around the base line and there is no direction in which the points show a variance difference. Three observations are flagged (145, 56, 35), but they do not disturb the analysis and don’t have to be removed since they are still within the overall data point distribution.
The Normal Q-Q plot helps us to check the distribution of our dataset (normal or, for example, exponential). Our data points lie nicely along the line and are normally distributed, which means the data does not need any transformation (for example a logarithm or square root transformation) before further statistical analysis.
The Residuals vs. Leverage plot helps us to identify observations which might have a high impact on the statistical model. The observations in our case show very low leverage; even the points further away from the others show low leverage and so cannot be considered influential outliers.

Additionally, here is a plot of points vs. attitude with the regression line. For me the whole thing is easier to understand when the data is shown in a graph.

library(ggplot2)
qplot(attitude, points, data = lrn14_analysis) + geom_smooth(method = "lm")

In the graph one can check whether the estimated increase of 3.525 points per one unit of attitude matches the slope of the regression line.


2.3.2. Multiple regression model of points ~ attitude + stra

Here I’m checking the linear regression of points against attitude and the strategic learning approach - I want to know if the strategic learning approach has a relationship with the students’ exam points.

# points ~ attitude + stra ----

att_st_lm <- lm(points ~ attitude + stra, data = lrn14_analysis)
att_st_lm_res <- summary(att_st_lm)
att_st_lm_res
## 
## Call:
## lm(formula = points ~ attitude + stra, data = lrn14_analysis)
## 
## Residuals:
##      Min       1Q   Median       3Q      Max 
## -17.6482  -3.3135   0.5571   3.7966  10.9300 
## 
## Coefficients:
##             Estimate Std. Error t value Pr(>|t|)    
## (Intercept)   8.9725     2.3966   3.744 0.000251 ***
## attitude      3.4664     0.5652   6.134 6.27e-09 ***
## stra          0.9132     0.5345   1.709 0.089438 .  
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 5.289 on 163 degrees of freedom
## Multiple R-squared:  0.2048, Adjusted R-squared:  0.195 
## F-statistic: 20.99 on 2 and 163 DF,  p-value: 7.746e-09
knitr::kable(att_st_lm_res$coefficients, digits=3, caption="Regression coefficients point ~ attitude + stra")
Regression coefficients points ~ attitude + stra
            Estimate  Std. Error  t value  Pr(>|t|)
(Intercept)    8.972       2.397    3.744     0.000
attitude       3.466       0.565    6.134     0.000
stra           0.913       0.534    1.709     0.089

Here are the diagnostic plots of the regression model points ~ attitude + stra:

plot(att_st_lm, which = c(1,2,5))

Regression model points ~ attitude + stra interpretation:

In this multiple regression I wanted to see if the strategic learning approach influences the number of exam points a student can reach. The outcome shows that the strategic learning approach has some influence, but not a strong one: the p-value of 0.089 shows that the effect is not quite significant. The multiple R-squared is a bit higher compared to the points ~ attitude model, so the model explains slightly more of the variation when we take the strategic learning approach into account.
The diagnostic plots show more or less the same results as for the points ~ attitude model: the data is normally distributed and no outliers significantly disturb the analysis.


2.3.3. Multiple regression model of points ~ deep + surf + stra

In the third regression model I regress points on the deep, surface and strategic learning approaches. Does the combination of the three learning approaches have an influence on the students’ exam points?

# points ~ deep + surf + stra ----

de_su_st_lm <- lm(points ~ deep + surf + stra, data = lrn14_analysis)
de_su_st_lm_res <- summary(de_su_st_lm)

de_su_st_lm_res
## 
## Call:
## lm(formula = points ~ deep + surf + stra, data = lrn14_analysis)
## 
## Residuals:
##      Min       1Q   Median       3Q      Max 
## -15.1208  -3.0725   0.5196   4.2798  10.3346 
## 
## Coefficients:
##             Estimate Std. Error t value Pr(>|t|)    
## (Intercept)  26.9426     5.1147   5.268 4.34e-07 ***
## deep         -0.7472     0.8659  -0.863   0.3895    
## surf         -1.6328     0.9149  -1.785   0.0762 .  
## stra          0.9850     0.5962   1.652   0.1005    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 5.827 on 162 degrees of freedom
## Multiple R-squared:  0.04072,    Adjusted R-squared:  0.02296 
## F-statistic: 2.292 on 3 and 162 DF,  p-value: 0.0801
knitr::kable(de_su_st_lm_res$coefficients, digits=3, caption="Regression coefficients point ~ deep + surf + stra")
Regression coefficients points ~ deep + surf + stra
            Estimate  Std. Error  t value  Pr(>|t|)
(Intercept)   26.943       5.115    5.268     0.000
deep          -0.747       0.866   -0.863     0.389
surf          -1.633       0.915   -1.785     0.076
stra           0.985       0.596    1.652     0.100

Here are the diagnostic plots of the regression model points ~ deep + surf + stra:

plot(de_su_st_lm, which = c(1,2,5))

Regression model points ~ deep + surf + stra interpretation:

The outcome of this regression model shows that, with minor significance (p = 0.0762), the surface learning approach has some influence on the students’ exam outcomes. The estimate for surf shows a negative value of -1.63, which means that students tend to get about 1.6 points less per one-unit increase in the surface learning approach value. The deep learning approach seems to have no influence at all on the number of points students reach in the exams, and the strategic approach has a non-significant positive influence.
The multiple R-squared of 0.04 is very low, showing that these variables explain almost none of the variation in the students’ exam points.
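One way to see the -1.63 estimate in action is to compare model predictions one surf unit apart (a sketch assuming the de_su_st_lm model fitted above; the deep and stra values are arbitrary illustrative choices):

```r
# Predictions for two hypothetical students differing only in surf;
# their difference equals the surf coefficient, about -1.63 points
nd <- data.frame(deep = 3.7, stra = 3.1, surf = c(2, 3))
predict(de_su_st_lm, newdata = nd)
diff(predict(de_su_st_lm, newdata = nd))  # about -1.63
```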

Here I put the three variables into qplot() graphs to show the relationships between the learning approaches and the exam points.

library(ggplot2)
# note: par(mfrow = c(2,2)) has no effect on ggplot2 graphics, so the three plots are drawn separately
qplot(deep, points, data = lrn14_analysis) + geom_smooth(method = "lm") + ggtitle("Points vs. deep learning approach")

qplot(surf, points, data = lrn14_analysis) + geom_smooth(method = "lm")+ ggtitle("Points vs. surface learning approach")

qplot(stra, points, data = lrn14_analysis) + geom_smooth(method = "lm")+ ggtitle("Points vs. strategic learning approach")

So, this was an attempt to analyse the dataset with simple and multiple regression models. I hope the interpretations are easy to understand and the diary structure is not confusing.
I had fun preparing this first statistics and graphs chapter and I’m looking forward to learning more…


Chapter 3: Logistic regression

Data wrangling and performing a logistic regression analysis

Work of week 46 (11.11. - 17.11.2019)


1. Data wrangling

The R script is available in my GitHub repository. To get to the script, click here


2. Analysis

2.1. Read the prepared data set

The working directory is set using setwd() and the data file “alc”, prepared in the data wrangling part, is read using the read.table() function. Afterwards the data frame is checked.

alc <- read.table(file = 
       "C:/Users/richla/OneDrive/1 C - R-Folder/11-IODS-course/IODS-project/data/alc_table.txt", stringsAsFactors = TRUE) 
       # read the data file and leave the binary values as they are

# Check the data frame
dim(alc)
## [1] 382  35
str(alc)
## 'data.frame':    382 obs. of  35 variables:
##  $ school    : Factor w/ 2 levels "GP","MS": 1 1 1 1 1 1 1 1 1 1 ...
##  $ sex       : Factor w/ 2 levels "F","M": 1 1 1 1 1 2 2 1 2 2 ...
##  $ age       : int  18 17 15 15 16 16 16 17 15 15 ...
##  $ address   : Factor w/ 2 levels "R","U": 2 2 2 2 2 2 2 2 2 2 ...
##  $ famsize   : Factor w/ 2 levels "GT3","LE3": 1 1 2 1 1 2 2 1 2 1 ...
##  $ Pstatus   : Factor w/ 2 levels "A","T": 1 2 2 2 2 2 2 1 1 2 ...
##  $ Medu      : int  4 1 1 4 3 4 2 4 3 3 ...
##  $ Fedu      : int  4 1 1 2 3 3 2 4 2 4 ...
##  $ Mjob      : Factor w/ 5 levels "at_home","health",..: 1 1 1 2 3 4 3 3 4 3 ...
##  $ Fjob      : Factor w/ 5 levels "at_home","health",..: 5 3 3 4 3 3 3 5 3 3 ...
##  $ reason    : Factor w/ 4 levels "course","home",..: 1 1 3 2 2 4 2 2 2 2 ...
##  $ nursery   : Factor w/ 2 levels "no","yes": 2 1 2 2 2 2 2 2 2 2 ...
##  $ internet  : Factor w/ 2 levels "no","yes": 1 2 2 2 1 2 2 1 2 2 ...
##  $ guardian  : Factor w/ 3 levels "father","mother",..: 2 1 2 2 1 2 2 2 2 2 ...
##  $ traveltime: int  2 1 1 1 1 1 1 2 1 1 ...
##  $ studytime : int  2 2 2 3 2 2 2 2 2 2 ...
##  $ failures  : int  0 0 2 0 0 0 0 0 0 0 ...
##  $ schoolsup : Factor w/ 2 levels "no","yes": 2 1 2 1 1 1 1 2 1 1 ...
##  $ famsup    : Factor w/ 2 levels "no","yes": 1 2 1 2 2 2 1 2 2 2 ...
##  $ paid      : Factor w/ 2 levels "no","yes": 1 1 2 2 2 2 1 1 2 2 ...
##  $ activities: Factor w/ 2 levels "no","yes": 1 1 1 2 1 2 1 1 1 2 ...
##  $ higher    : Factor w/ 2 levels "no","yes": 2 2 2 2 2 2 2 2 2 2 ...
##  $ romantic  : Factor w/ 2 levels "no","yes": 1 1 1 2 1 1 1 1 1 1 ...
##  $ famrel    : int  4 5 4 3 4 5 4 4 4 5 ...
##  $ freetime  : int  3 3 3 2 3 4 4 1 2 5 ...
##  $ goout     : int  4 3 2 2 2 2 4 4 2 1 ...
##  $ Dalc      : int  1 1 2 1 1 1 1 1 1 1 ...
##  $ Walc      : int  1 1 3 1 2 2 1 1 1 1 ...
##  $ health    : int  3 3 3 5 5 5 3 1 1 5 ...
##  $ absences  : int  5 3 8 1 2 8 0 4 0 0 ...
##  $ G1        : int  2 7 10 14 8 14 12 8 16 13 ...
##  $ G2        : int  8 8 10 14 12 14 12 9 17 14 ...
##  $ G3        : int  8 8 11 14 12 14 12 10 18 14 ...
##  $ alc_use   : num  1 1 2.5 1 1.5 1.5 1 1 1 1 ...
##  $ high_use  : logi  FALSE FALSE TRUE FALSE FALSE FALSE ...
colnames(alc)
##  [1] "school"     "sex"        "age"        "address"    "famsize"   
##  [6] "Pstatus"    "Medu"       "Fedu"       "Mjob"       "Fjob"      
## [11] "reason"     "nursery"    "internet"   "guardian"   "traveltime"
## [16] "studytime"  "failures"   "schoolsup"  "famsup"     "paid"      
## [21] "activities" "higher"     "romantic"   "famrel"     "freetime"  
## [26] "goout"      "Dalc"       "Walc"       "health"     "absences"  
## [31] "G1"         "G2"         "G3"         "alc_use"    "high_use"

The data frame consists of 35 variables with 382 observations.

2.2. The data set

The provided data comes from a study on student achievement in secondary education at two Portuguese schools.
It was collected via school reports and questionnaires and it includes:

  • student grades
  • demographic features
  • social features
  • school related features

The data consists of two data sets concerning the students’ performance in Mathematics (mat) and Portuguese language (por). This data was previously used in a publication by Cortez & Silva (2008).

In the data wrangling part of this exercise the two data sets mat and por were merged into one data set named “alc”. Columns (variables) which were not used for joining the data sets were combined by averaging. Since we are interested in analysing the students’ alcohol consumption, we created two new variables in the data set:

  • “alc_use” –> calculated average of weekday (“Dalc”) and weekend (“Walc”) alcohol consumption
  • “high_use” –> logical value, TRUE or FALSE depending on whether the “alc_use” of a student is higher than 2
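The two derived variables can be sketched on a toy data frame (made-up rows; the real script operates on the merged alc data):

```r
library(dplyr)

# Toy stand-in with only the two consumption columns (scores 1-5)
toy <- data.frame(Dalc = c(1, 2, 3), Walc = c(1, 3, 5))

toy <- toy %>%
  mutate(alc_use  = (Dalc + Walc) / 2,  # average weekday/weekend consumption
         high_use = alc_use > 2)        # TRUE when the average exceeds 2

toy$alc_use   # 1.0 2.5 4.0
toy$high_use  # FALSE TRUE TRUE
```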

All other variables are explained in detail here (check attribute information).


2.3. Performing the analyses

2.3.1. Choose variables for the analyses

The task of this exercise is to figure out the relationships between high and low alcohol consumption and the other variables in this data set. I have chosen to run the analyses on the following four variables:

  • sex –> student’s sex (binary: ‘F’ - female or ‘M’ - male)
  • failures –> number of past class failures (numeric: n if 1<=n<3, else 4)
  • famrel –> quality of family relationships (numeric: from 1 - very bad to 5 - excellent)
  • absences –> number of school absences (numeric: from 0 to 93)

I have chosen these four variables for the following reasons:

  1. The student’s gender (“sex”) might have quite a great influence on alcohol consumption. Young men tend to drink more alcohol for several reasons, while women are better able to control themselves (please don’t take my stereotypical expressions seriously!).

  2. The number of class failures (“failures”) of a student. Someone who fails in school more often might have a higher alcohol consumption than others (personal opinion), either with high use as a cause of failure or as a response to it.

  3. I think the overall quality of family relationships (“famrel”) might have a huge effect on a student’s alcohol consumption. I know this is a very, very conservative point of view, but in a country like Portugal (Catholic, maybe more patriarchal family structures) this might influence young people’s alcohol consumption.

  4. The number of school absences (“absences”). I think someone who is absent from school may also spend that time drinking alcohol. I know there are other things to do as well, but when I was that age drinking beer was somehow a way to spend your time.


2.3.2. Explore the distributions of the chosen variables numerically and graphically

The task is to explore the distributions of the variables I have chosen and their relationships with alcohol consumption, numerically and graphically.

library(tidyr)
library(dplyr)
library(ggplot2)
library(knitr)
library(kableExtra)

First I check how many female and male students there are:

# check the number of female and male students
alc %>% group_by(sex) %>% summarise(count = n()) -> fm
knitr::kable(fm, caption="Students") %>% kable_styling(bootstrap_options = "hover", full_width = FALSE, position = "center")
Students
sex  count
F      198
M      184

2.3.2.1. General data overview

This bar plot gives us an overview of the different variables and might give a hint of which variables have a strong relationship with alcohol consumption.


2.3.2.2. General alcohol consumption overview

The graph shows the distribution of the students’ average alcohol consumption: 1 is the lowest consumption, 5 the highest. The number of students with the lowest consumption is around 130 of the 382.


2.3.2.3. High alcohol consumption of students by gender

This graph gives an overview of the alcohol consumption of students divided by gender. With this graph you can more easily see the number of male or female students with higher alcohol consumption. High alcohol consumption (> 2) is indicated in turquoise, lower alcohol consumption in red. You can see that the number of male students with high alcohol consumption is higher compared to female students.
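The plot code was not shown above; a bar plot like this could be produced roughly as follows (a sketch assuming the alc data frame read in earlier):

```r
library(ggplot2)

# Counts of students per gender, filled by high/low alcohol use
ggplot(alc, aes(x = sex, fill = high_use)) +
  geom_bar() +
  ggtitle("High alcohol consumption of students by gender")
```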


2.3.2.4. High alcohol consumption and student’s class failures

This graph shows that students with higher alcohol consumption tend to have a higher class failure rate. Female students with higher alcohol use have a higher count at 1 failure, male students at 2 or 3 failures.

Generally, you see that students without high alcohol consumption fail fewer classes.


2.3.2.5. High alcohol consumption and student’s family relationship status

There are more students with low alcohol consumption where the family relationship is good (4-5). Interestingly, the number of students with high alcohol consumption also increases with the quality of the family relationship status (higher counts at statuses 3-5).


2.3.2.6. High alcohol consumption and students’ school absences

High alcohol use seems to influence the rate of school absences: students with higher alcohol consumption have an increased number of absences. Females tend to be more absent than males, even when they don’t fall into the group of high alcohol consumers.


2.3.3. Logistic regression - explore the relationship between the chosen variables and high alcohol consumption

2.3.3.1. Calculation of the logistic regression model

In the model I’m using the four variables I have chosen: sex, failures, famrel and absences.

# cv_glm, logistic model with the chosen variables
cv_glm <- glm(high_use ~ sex + failures + famrel + absences, data = alc, family = "binomial")

# Summary of the model (cv_glm_sum)
cv_glm_sum <- summary(cv_glm)
## 
## Call:
## glm(formula = high_use ~ sex + failures + famrel + absences, 
##     family = "binomial", data = alc)
## 
## Deviance Residuals: 
##     Min       1Q   Median       3Q      Max  
## -2.1174  -0.8376  -0.5867   1.0091   2.1557  
## 
## Coefficients:
##             Estimate Std. Error z value Pr(>|z|)    
## (Intercept) -0.83877    0.53446  -1.569   0.1166    
## sexM         0.99120    0.24540   4.039 5.37e-05 ***
## failures     0.42458    0.18854   2.252   0.0243 *  
## famrel      -0.27632    0.12837  -2.153   0.0314 *  
## absences     0.09052    0.02252   4.020 5.83e-05 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## (Dispersion parameter for binomial family taken to be 1)
## 
##     Null deviance: 465.68  on 381  degrees of freedom
## Residual deviance: 419.78  on 377  degrees of freedom
## AIC: 429.78
## 
## Number of Fisher Scoring iterations: 4
Model coefficients of chosen variables
            Estimate  Std. Error  z value  Pr(>|z|)
(Intercept)   -0.839       0.534   -1.569     0.117
sexM           0.991       0.245    4.039     0.000
failures       0.425       0.189    2.252     0.024
famrel        -0.276       0.128   -2.153     0.031
absences       0.091       0.023    4.020     0.000

The model output shows clear results, and significant ones too. So which variables influence high alcohol consumption?

  • Being a male student significantly (p = 5.37e-05 ***) increases the log-odds of high alcohol consumption (estimate ~0.99).
  • Class failures have a smaller but still positive effect on high consumption (p = 0.0243 *).
  • The quality of family relationships has a negative effect on high alcohol consumption (p = 0.0314 *).
  • School absences also show a significant positive effect on high alcohol consumption (p = 5.83e-05 ***).

These are first results, but we will not use them as our final interpretation of the logistic regression. Next we check the odds ratios of the model.


2.3.3.2. Calculating the odds ratios
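The odds ratios below can be computed along these lines (a sketch of the usual approach: the odds ratios are the exponentiated model coefficients, and exponentiating the confidence limits of the coefficients gives their intervals):

```r
# odds ratios are the exponentiated coefficients of the logistic model
OR <- exp(coef(cv_glm))

# exponentiate the (profile likelihood) confidence limits of the coefficients
CI <- exp(confint(cv_glm))

# combine and round to match the table below
round(cbind(OR, CI), digits = 3)
```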

Odds ratios and confidence intervals

              odds ratio   2.5 %   97.5 %
(Intercept)        0.432   0.149    1.220
sexM               2.694   1.676    4.395
failures           1.529   1.057    2.228
famrel             0.759   0.589    0.976
absences           1.095   1.050    1.147

Interpretation of odds ratios: if the odds ratio is lower than 1, the risk of high alcohol consumption is lower; if it is greater than 1, the risk of high consumption increases. We also have to look at the confidence intervals: if an interval crosses the value 1, the odds ratio is not significant. The odds ratios of this logistic model lead to the following interpretation.

  • sexM has a value of 2.7; the confidence interval is between 1.7 and 4.4. This odds ratio tells us that male students have 2.7 times higher odds of high alcohol consumption.
  • failures has a value of 1.5; the confidence interval is between 1.1 and 2.2. Each additional class failure multiplies the odds of high alcohol consumption by about 1.5.
  • famrel has a value of 0.8; the confidence interval is between 0.6 and 1.0. Better family relationships reduce the odds of high alcohol consumption, although the upper limit of the interval is very close to 1.
  • absences has a value of 1.095; the confidence interval is between 1.050 and 1.147. Each additional school absence increases the odds of high alcohol consumption by about 10 %.

2.3.3.3. Explore the predictive power of the model

I calculated the probability of high alcohol use and added it to the “alc” data frame. High alcohol consumption is then predicted when this probability is greater than 0.5; this outcome was also added to the “alc” data frame.
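A sketch of how these two columns can be added (assuming dplyr is loaded):

```r
library(dplyr)

# predicted probability of high alcohol use from the fitted model
alc <- mutate(alc, probability = predict(cv_glm, type = "response"))

# predict high_use when the probability is greater than 0.5
alc <- mutate(alc, prediction = probability > 0.5)
```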
Here are the last 20 data points with the probability and prediction outputs:

##     age Pstatus failures famrel high_use probability prediction
## 363  17       T        0      3    FALSE  0.28017549      FALSE
## 364  17       T        0      3    FALSE  0.29878836      FALSE
## 365  18       T        0      5    FALSE  0.09793643      FALSE
## 366  18       T        0      4    FALSE  0.15809170      FALSE
## 367  18       T        0      5     TRUE  0.11513474      FALSE
## 368  18       T        0      4    FALSE  0.12520393      FALSE
## 369  17       T        0      4     TRUE  0.33697399      FALSE
## 370  18       T        0      3     TRUE  0.42202546      FALSE
## 371  18       T        0      4    FALSE  0.31608605      FALSE
## 372  17       T        0      4    FALSE  0.33597108      FALSE
## 373  19       T        1      4    FALSE  0.37092103      FALSE
## 374  18       T        1      5     TRUE  0.45736187      FALSE
## 375  18       T        0      5    FALSE  0.10622930      FALSE
## 376  18       T        0      4    FALSE  0.19766624      FALSE
## 377  19       T        1      5    FALSE  0.16593043      FALSE
## 378  18       T        0      4    FALSE  0.14641340      FALSE
## 379  18       T        2      2    FALSE  0.41066768      FALSE
## 380  18       T        0      1    FALSE  0.30079024      FALSE
## 381  17       T        0      2     TRUE  0.49046497      FALSE
## 382  18       T        0      4     TRUE  0.31608605      FALSE
2 x 2 cross tabulation

               Prediction
High use      FALSE   TRUE
  FALSE         253     15
  TRUE           82     32
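Such a cross table can be produced with base R's table() on the actual values and the predictions (a sketch):

```r
# cross tabulate the actual high_use values against the predictions
table(high_use = alc$high_use, prediction = alc$prediction)
```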

Interpretation of the prediction:
The 2 x 2 cross table compares the model's predictions of high alcohol consumption with the actual values.
The model produced 253 correct negative results and 82 false negatives. In those 82 cases the model predicted that a student does not have high alcohol consumption, but the student actually does; this number of false negatives should be much smaller.
In 15 cases the model wrongly predicted high alcohol consumption; these are false positives. 253 cases were correctly predicted as false and 32 as true for high alcohol consumption.


2.3.3.4. Graphic visualization of actual values and predictions

The graph represents the 2 x 2 cross table in graphical form, with the false negative values in the top left: 82 cases predicted as FALSE but with a TRUE high alcohol use.

Target variable versus the predictions

               Prediction
high use      FALSE     TRUE      Sum
  FALSE      0.6623   0.0393   0.7016
  TRUE       0.2147   0.0838   0.2984
  Sum        0.8770   0.1230   1.0000

Is the prediction of high alcohol use correct or not?
This probability table gives the correct predictions and the false negative and false positive values as fractions of the total number of observations, i.e. as values between 0 and 1.
So 21.5 % of the predictions are false negatives and 3.9 % are false positives.
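The proportion table above can be derived from the cross table with prop.table() and addmargins() (a sketch, assuming the pipe operator is available via dplyr):

```r
# the same 2 x 2 table as fractions of all observations, with margin sums
table(high_use = alc$high_use, prediction = alc$prediction) %>%
  prop.table() %>%
  addmargins()
```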


2.3.3.5. Computing the loss function - the training error

To measure the performance of the logistic regression model we apply a loss function that calculates the average proportion of wrong predictions of our model.
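The loss function can be defined as in the DataCamp exercises (a sketch): a prediction counts as wrong when the probability is on the wrong side of 0.5.

```r
# loss function: the proportion of predictions on the wrong side of 0.5
# (class is the TRUE/FALSE target, prob the predicted probability)
loss_func <- function(class, prob) {
  n_wrong <- abs(class - prob) > 0.5
  mean(n_wrong)
}

# average proportion of wrong predictions on the training data
loss_func(class = alc$high_use, prob = alc$probability)
```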

This results in the following value:

## [1] 0.2539267

So 25.4 % of all predictions are wrong; roughly every fourth prediction is incorrect. (To check this, you can sum the false positive and false negative fractions from the "Target variable versus the predictions" table: 21.5 % + 3.9 % = 25.4 %.)
So on the training data the model has a prediction error of 25.4 %.


2.3.3.6. Cross-validation - the testing error

Here we test how well our model performs on unseen data. The loss function value is computed on data that was not used to train the model; a low value is good. In K-fold cross-validation the data set is split into K groups, and each group in turn is held out as test data while the model is trained on the remaining groups. Here we choose K = 10, so the data set is split into 10 groups.

# K-fold cross-validation
library(boot)
cross_val<- cv.glm(data = alc, cost = loss_func, glmfit = cv_glm, K = 10)

The cross-validation results in the following value:

# average number of wrong predictions in the cross validation
cross_val$delta[1]
## [1] 0.2591623

This is the average proportion of wrong predictions in the cross-validation. The prediction error on the testing data is higher than on the training data!

Note that this value changes with each run, because the cross-validation randomly splits the data into different groups every time.

The cross-validation tells us whether the model is fitted too closely to the training data. The testing error is expected to be somewhat higher than the loss function value on the training data; if it is only slightly higher, the model is not overfitting and generalizes reasonably well to unseen data.


Bonus tasks

Bonus task 1

The 10-fold cross-validation of my model was computed in section 2.3.3.6 above. My logistic model does not perform better than the DataCamp model (both have an error value of ~0.26).

I think a better model could be found - maybe more variables need to be used in the model, for example the age of the students or if students are in a relationship. These factors could make the model more accurate.


Bonus task 2

Cross-validations with different logistic regression models

Model 1: 8 variables (sex, age, Pstatus, guardian, failures, famrel, freetime, absences)

Model 2: 7 variables (sex, age, guardian, failures, famrel, freetime, absences) - Pstatus

Model 3: 6 variables (sex, age, failures, famrel, freetime, absences) - Pstatus, guardian

Model 4: 5 variables (sex, age, failures, freetime, absences) - Pstatus, guardian, famrel

Model 5: 4 variables (sex, age, failures, absences) - Pstatus, guardian, famrel, freetime

Model 6: 3 variables (sex, failures, absences) - Pstatus, guardian, famrel, freetime, age

The graph shows a red line representing the training error values and a green line representing the testing error values. You can see how the error values change: the more variables are used in the logistic regression model, the higher the training and testing errors become. It seems the model then fits too deeply into the data.
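The comparison above can be sketched with a loop over the six model formulas, reusing loss_func from section 2.3.3.5 (results vary between runs because of the random cross-validation splits):

```r
# sketch: training and 10-fold CV (testing) errors for the six models above
library(boot)

formulas <- list(
  high_use ~ sex + age + Pstatus + guardian + failures + famrel + freetime + absences,
  high_use ~ sex + age + guardian + failures + famrel + freetime + absences,
  high_use ~ sex + age + failures + famrel + freetime + absences,
  high_use ~ sex + age + failures + freetime + absences,
  high_use ~ sex + age + failures + absences,
  high_use ~ sex + failures + absences
)

errors <- sapply(formulas, function(f) {
  m <- glm(f, data = alc, family = "binomial")
  c(training = loss_func(alc$high_use, predict(m, type = "response")),
    testing  = cv.glm(data = alc, cost = loss_func, glmfit = m, K = 10)$delta[1])
})
errors  # one column of training/testing error per model
```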



Chapter 4: Clustering and classification

Data wrangling and performing clustering and classification

Work of week 47 (18.11. - 24.11.2019)


1. Analysis of Boston data set

1.1. Load the data set & check the structure

# load necessary packages
library(MASS) # package includes the Boston data set
library(tidyr)
library(dplyr)
library(corrplot)
library(ggplot2)
library(GGally)
library(knitr)
library(kableExtra)

data(Boston) # load the Boston data set

1.2. The data set

The Boston data set consists of 14 variables with 506 observations. The data is about housing values in Boston suburbs. The data set has the following variables (columns):

  • crim: the per capita crime rate by town
  • zn: proportion of residential land zoned for lots over 25,000 sq.ft
  • indus: proportion of non-retail business acres per town
  • chas: Charles River dummy variable (= 1 if tract bounds river; 0 otherwise)
  • nox: nitrogen oxides concentration (parts per 10 million)
  • rm: average number of rooms per dwelling
  • age: proportion of owner-occupied units built prior to 1940
  • dis: weighted mean of distances to five Boston employment centres
  • rad: index of accessibility to radial highways
  • tax: full-value property-tax rate per $10,000
  • ptratio: pupil-teacher ratio by town
  • black: 1000(Bk - 0.63)^2 where Bk is the proportion of blacks by town
  • lstat: lower status of the population (percent)
  • medv: median value of owner-occupied homes in $1000s

In the later analysis we are interested in the per capita crime rate. Overall the crime rate is low, and most observations are concentrated at low crime-rate values.

str(Boston) # check the structure
## 'data.frame':    506 obs. of  14 variables:
##  $ crim   : num  0.00632 0.02731 0.02729 0.03237 0.06905 ...
##  $ zn     : num  18 0 0 0 0 0 12.5 12.5 12.5 12.5 ...
##  $ indus  : num  2.31 7.07 7.07 2.18 2.18 2.18 7.87 7.87 7.87 7.87 ...
##  $ chas   : int  0 0 0 0 0 0 0 0 0 0 ...
##  $ nox    : num  0.538 0.469 0.469 0.458 0.458 0.458 0.524 0.524 0.524 0.524 ...
##  $ rm     : num  6.58 6.42 7.18 7 7.15 ...
##  $ age    : num  65.2 78.9 61.1 45.8 54.2 58.7 66.6 96.1 100 85.9 ...
##  $ dis    : num  4.09 4.97 4.97 6.06 6.06 ...
##  $ rad    : int  1 2 2 3 3 3 5 5 5 5 ...
##  $ tax    : num  296 242 242 222 222 222 311 311 311 311 ...
##  $ ptratio: num  15.3 17.8 17.8 18.7 18.7 18.7 15.2 15.2 15.2 15.2 ...
##  $ black  : num  397 397 393 395 397 ...
##  $ lstat  : num  4.98 9.14 4.03 2.94 5.33 ...
##  $ medv   : num  24 21.6 34.7 33.4 36.2 28.7 22.9 27.1 16.5 18.9 ...
knitr::kable(head(Boston)) %>% 
  kable_styling(bootstrap_options = "striped", full_width = FALSE, position = "center") # the data frame head
crim zn indus chas nox rm age dis rad tax ptratio black lstat medv
0.00632 18 2.31 0 0.538 6.575 65.2 4.0900 1 296 15.3 396.90 4.98 24.0
0.02731 0 7.07 0 0.469 6.421 78.9 4.9671 2 242 17.8 396.90 9.14 21.6
0.02729 0 7.07 0 0.469 7.185 61.1 4.9671 2 242 17.8 392.83 4.03 34.7
0.03237 0 2.18 0 0.458 6.998 45.8 6.0622 3 222 18.7 394.63 2.94 33.4
0.06905 0 2.18 0 0.458 7.147 54.2 6.0622 3 222 18.7 396.90 5.33 36.2
0.02985 0 2.18 0 0.458 6.430 58.7 6.0622 3 222 18.7 394.12 5.21 28.7
knitr::kable(summary(Boston)) %>% 
  kable_styling(bootstrap_options = "striped", position = "center", font_size = 11) %>% 
  scroll_box(width = "100%", height = "300px")# summary statistics
crim zn indus chas nox rm age dis rad tax ptratio black lstat medv
Min. : 0.00632 Min. : 0.00 Min. : 0.46 Min. :0.00000 Min. :0.3850 Min. :3.561 Min. : 2.90 Min. : 1.130 Min. : 1.000 Min. :187.0 Min. :12.60 Min. : 0.32 Min. : 1.73 Min. : 5.00
1st Qu.: 0.08204 1st Qu.: 0.00 1st Qu.: 5.19 1st Qu.:0.00000 1st Qu.:0.4490 1st Qu.:5.886 1st Qu.: 45.02 1st Qu.: 2.100 1st Qu.: 4.000 1st Qu.:279.0 1st Qu.:17.40 1st Qu.:375.38 1st Qu.: 6.95 1st Qu.:17.02
Median : 0.25651 Median : 0.00 Median : 9.69 Median :0.00000 Median :0.5380 Median :6.208 Median : 77.50 Median : 3.207 Median : 5.000 Median :330.0 Median :19.05 Median :391.44 Median :11.36 Median :21.20
Mean : 3.61352 Mean : 11.36 Mean :11.14 Mean :0.06917 Mean :0.5547 Mean :6.285 Mean : 68.57 Mean : 3.795 Mean : 9.549 Mean :408.2 Mean :18.46 Mean :356.67 Mean :12.65 Mean :22.53
3rd Qu.: 3.67708 3rd Qu.: 12.50 3rd Qu.:18.10 3rd Qu.:0.00000 3rd Qu.:0.6240 3rd Qu.:6.623 3rd Qu.: 94.08 3rd Qu.: 5.188 3rd Qu.:24.000 3rd Qu.:666.0 3rd Qu.:20.20 3rd Qu.:396.23 3rd Qu.:16.95 3rd Qu.:25.00
Max. :88.97620 Max. :100.00 Max. :27.74 Max. :1.00000 Max. :0.8710 Max. :8.780 Max. :100.00 Max. :12.127 Max. :24.000 Max. :711.0 Max. :22.00 Max. :396.90 Max. :37.97 Max. :50.00

1.2.1 Overview plot and data description

# graphical overview of the Boston data set
ov_boston <- ggpairs(Boston, mapping = aes(), title ="Overview of the Boston data set", 
                     lower = list(combo = wrap("facethist", bins = 20)), 
                     upper = list(continuous = wrap("cor", size = 2.8)))
# overview plot of interesting data
bos_detail <- Boston[,c("crim","dis","tax","medv")]

ov_bos_detail <- ggpairs(bos_detail, mapping = aes(),lower = list(combo = wrap("facethist", bins = 20)), 
                     upper = list(continuous = wrap("cor", size = 5)))
ov_bos_detail

The overview plot of the variables that interest me gives the following information. The histograms show that the crime rate is concentrated at low values. There might be some relationship with the distance to the employment centres. The tax rate has a high count at a low level, decreases with higher taxes and shows again a higher count at high taxes. The median value of the owner-occupied homes peaks around $20,000. Tax and crime rate show a correlation of 50 %.


1.2.2. Data correlations of the Boston data set

# calculate the correlation matrix and round it
cor_matrix<-cor(Boston) %>% round(digits = 2)

# print the correlation matrix
knitr::kable(cor_matrix, caption="Correlation matrix values") %>% 
  kable_styling(bootstrap_options = "striped", full_width = FALSE, position = "center") %>% 
  scroll_box(width = "100%", height = "300px")# summary statistics
Correlation matrix values
crim zn indus chas nox rm age dis rad tax ptratio black lstat medv
crim 1.00 -0.20 0.41 -0.06 0.42 -0.22 0.35 -0.38 0.63 0.58 0.29 -0.39 0.46 -0.39
zn -0.20 1.00 -0.53 -0.04 -0.52 0.31 -0.57 0.66 -0.31 -0.31 -0.39 0.18 -0.41 0.36
indus 0.41 -0.53 1.00 0.06 0.76 -0.39 0.64 -0.71 0.60 0.72 0.38 -0.36 0.60 -0.48
chas -0.06 -0.04 0.06 1.00 0.09 0.09 0.09 -0.10 -0.01 -0.04 -0.12 0.05 -0.05 0.18
nox 0.42 -0.52 0.76 0.09 1.00 -0.30 0.73 -0.77 0.61 0.67 0.19 -0.38 0.59 -0.43
rm -0.22 0.31 -0.39 0.09 -0.30 1.00 -0.24 0.21 -0.21 -0.29 -0.36 0.13 -0.61 0.70
age 0.35 -0.57 0.64 0.09 0.73 -0.24 1.00 -0.75 0.46 0.51 0.26 -0.27 0.60 -0.38
dis -0.38 0.66 -0.71 -0.10 -0.77 0.21 -0.75 1.00 -0.49 -0.53 -0.23 0.29 -0.50 0.25
rad 0.63 -0.31 0.60 -0.01 0.61 -0.21 0.46 -0.49 1.00 0.91 0.46 -0.44 0.49 -0.38
tax 0.58 -0.31 0.72 -0.04 0.67 -0.29 0.51 -0.53 0.91 1.00 0.46 -0.44 0.54 -0.47
ptratio 0.29 -0.39 0.38 -0.12 0.19 -0.36 0.26 -0.23 0.46 0.46 1.00 -0.18 0.37 -0.51
black -0.39 0.18 -0.36 0.05 -0.38 0.13 -0.27 0.29 -0.44 -0.44 -0.18 1.00 -0.37 0.33
lstat 0.46 -0.41 0.60 -0.05 0.59 -0.61 0.60 -0.50 0.49 0.54 0.37 -0.37 1.00 -0.74
medv -0.39 0.36 -0.48 0.18 -0.43 0.70 -0.38 0.25 -0.38 -0.47 -0.51 0.33 -0.74 1.00
# mark the insignificant values according to the significance level
# (cor.mtest needs the raw data, not the correlation matrix)
p.mat <- cor.mtest(Boston)$p

# visualize the correlation matrix
# correlations / colour shows the correlation values
corrplot(cor_matrix, method="circle", type="upper",  tl.cex = 0.6, p.mat = p.mat, sig.level = 0.01, title="Correlations of the Boston data set", mar=c(0,0,1,0))  

Insignificant values are shown with a cross in the square. The crime rate shows rather strong positive correlation with accessibility to highways, the property-tax rate and with the lower population status.


1.3. Data set analysis

1.3.1. Data standardization (scaling of the data set)

Here we standardize the data set and print summaries of the scaled data. Scaling is a way to standardize the values (a transformation) so that observations of different variables can be compared: each variable gets an overall mean of 0 and a standard deviation of 1, producing both negative and positive values.

# center and standardize variables
boston_scaled <- scale(Boston)

# summaries of the scaled variables
knitr::kable(summary(boston_scaled), caption="Summary of scaled Boston data set") %>% 
  kable_styling(bootstrap_options = "striped", full_width = FALSE, position = "center", font_size = 11)  %>% 
  scroll_box(width = "100%", height = "300px")
Summary of scaled Boston data set
crim zn indus chas nox rm age dis rad tax ptratio black lstat medv
Min. :-0.419367 Min. :-0.48724 Min. :-1.5563 Min. :-0.2723 Min. :-1.4644 Min. :-3.8764 Min. :-2.3331 Min. :-1.2658 Min. :-0.9819 Min. :-1.3127 Min. :-2.7047 Min. :-3.9033 Min. :-1.5296 Min. :-1.9063
1st Qu.:-0.410563 1st Qu.:-0.48724 1st Qu.:-0.8668 1st Qu.:-0.2723 1st Qu.:-0.9121 1st Qu.:-0.5681 1st Qu.:-0.8366 1st Qu.:-0.8049 1st Qu.:-0.6373 1st Qu.:-0.7668 1st Qu.:-0.4876 1st Qu.: 0.2049 1st Qu.:-0.7986 1st Qu.:-0.5989
Median :-0.390280 Median :-0.48724 Median :-0.2109 Median :-0.2723 Median :-0.1441 Median :-0.1084 Median : 0.3171 Median :-0.2790 Median :-0.5225 Median :-0.4642 Median : 0.2746 Median : 0.3808 Median :-0.1811 Median :-0.1449
Mean : 0.000000 Mean : 0.00000 Mean : 0.0000 Mean : 0.0000 Mean : 0.0000 Mean : 0.0000 Mean : 0.0000 Mean : 0.0000 Mean : 0.0000 Mean : 0.0000 Mean : 0.0000 Mean : 0.0000 Mean : 0.0000 Mean : 0.0000
3rd Qu.: 0.007389 3rd Qu.: 0.04872 3rd Qu.: 1.0150 3rd Qu.:-0.2723 3rd Qu.: 0.5981 3rd Qu.: 0.4823 3rd Qu.: 0.9059 3rd Qu.: 0.6617 3rd Qu.: 1.6596 3rd Qu.: 1.5294 3rd Qu.: 0.8058 3rd Qu.: 0.4332 3rd Qu.: 0.6024 3rd Qu.: 0.2683
Max. : 9.924110 Max. : 3.80047 Max. : 2.4202 Max. : 3.6648 Max. : 2.7296 Max. : 3.5515 Max. : 1.1164 Max. : 3.9566 Max. : 1.6596 Max. : 1.7964 Max. : 1.6372 Max. : 0.4406 Max. : 3.5453 Max. : 2.9865
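As a sanity check of what scale() does (a sketch, not part of the original analysis), the scaled values of one column can be reproduced by hand: subtract the column mean and divide by the standard deviation.

```r
# scale() subtracts the column mean and divides by the standard deviation:
# scaled value = (x - mean(x)) / sd(x)
all.equal(as.vector(scale(Boston$crim)),
          (Boston$crim - mean(Boston$crim)) / sd(Boston$crim))
## [1] TRUE
```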
# change the object to data frame
boston_scaled <- as.data.frame(boston_scaled)

knitr::kable(boston_scaled, caption="Values of scaled Boston data set") %>% kable_styling(bootstrap_options = "striped", full_width = FALSE, position = "center") %>% scroll_box(width = "100%", height = "300px")
Values of scaled Boston data set (first rows of the scrollable table; it holds all 506 scaled observations)
crim zn indus chas nox rm age dis rad tax ptratio black lstat medv
-0.4193669 0.2845483 -1.2866362 -0.2723291 -0.1440749 0.4132629 -0.1198948 0.1400750 -0.9818712 -0.6659492 -1.4575580 0.4406159 -1.0744990 0.1595278
-0.4169267 -0.4872402 -0.5927944 -0.2723291 -0.7395304 0.1940824 0.3668034 0.5566090 -0.8670245 -0.9863534 -0.3027945 0.4406159 -0.4919525 -0.1014239
-0.4169290 -0.4872402 -0.5927944 -0.2723291 -0.7395304 1.2814456 -0.2655490 0.5566090 -0.8670245 -0.9863534 -0.3027945 0.3960351 -1.2075324 1.3229375
-0.4163384 -0.4872402 -1.3055857 -0.2723291 -0.8344581 1.0152978 -0.8090878 1.0766711 -0.7521778 -1.1050216 0.1129203 0.4157514 -1.3601708 1.1815886
-0.4120741 -0.4872402 -1.3055857 -0.2723291 -0.8344581 1.2273620 -0.5106743 1.0766711 -0.7521778 -1.1050216 0.1129203 0.4406159 -1.0254866 1.4860323
-0.4166314 -0.4872402 -1.3055857 -0.2723291 -0.8344581 0.2068916 -0.3508100 1.0766711 -0.7521778 -1.1050216 0.1129203 0.4101651 -1.0422909 0.6705582
(the table continues through all 506 observations)
-0.4121264 -0.4872402 -1.2020925 -0.2723291 -0.9466453 1.6102164 -0.2158134 -0.1423950 -0.8670245 -0.7846174 -0.2104134 0.4406159 -0.9050564 1.1598427
-0.4028187 -0.4872402 -0.3756044 -0.2723291 -0.2994111 0.6295970 0.4023288 -0.4830877 -0.5224844 -0.1438090 1.1291122 0.4171754 -0.4527427 0.5400824
-0.4068110 -0.4872402 -0.3756044 -0.2723291 -0.2994111 0.7064525 0.0968103 -0.4459031 -0.5224844 -0.1438090 1.1291122 0.4261573 -0.6978043 0.4313525
-0.3935065 -0.4872402 -0.3756044 -0.2723291 -0.2994111 0.1713104 0.5977186 -0.5130539 -0.5224844 -0.1438090 1.1291122 -3.1313265 -0.2833001 -0.4276135
-0.3955003 -0.4872402 -0.3756044 -0.2723291 -0.2994111 -0.2101207 0.6687695 -0.5130539 -0.5224844 -0.1438090 1.1291122 0.4139988 0.1101988 -0.3515026
-0.4038720 -0.4872402 -0.3756044 -0.2723291 -0.2994111 -0.1674232 0.7611355 -0.6525317 -0.5224844 -0.1438090 1.1291122 0.3945016 -0.0452402 -0.2645187
-0.4046835 -0.4872402 -0.3756044 -0.2723291 -0.2994111 -0.6171702 0.9991558 -0.8016976 -0.5224844 -0.1438090 1.1291122 0.4093984 0.5345055 -0.3297567
-0.4001983 -0.4872402 -0.3756044 -0.2723291 -0.2994111 -0.6385190 0.8286338 -0.7522606 -0.5224844 -0.1438090 1.1291122 0.4271431 0.8411826 -0.3297567
-0.4048521 -0.4872402 -0.3756044 -0.2723291 -0.2994111 -0.2243532 0.5906135 -0.7943366 -0.5224844 -0.1438090 1.1291122 0.3397340 0.2012217 -0.2318998
-0.4052183 -0.4872402 -0.3756044 -0.2723291 -0.2994111 0.2695146 1.0133660 -0.6468804 -0.5224844 -0.1438090 1.1291122 0.4224331 -0.0536423 -0.2971377
-0.3894525 -0.4872402 -0.3756044 -0.2723291 -0.2994111 -0.0791817 0.8037660 -0.5935968 -0.5224844 -0.1438090 1.1291122 0.3785094 0.4056731 -0.3406296
-0.4075539 -0.4872402 -0.3756044 -0.2723291 -0.2994111 -0.1275722 -0.5035693 -0.4830877 -0.5224844 -0.1438090 1.1291122 0.4032644 0.0485834 -0.0905509
-0.4083782 -0.4872402 -0.1642450 -0.2723291 -0.0664067 0.6125180 0.4627220 -0.5307201 -0.4076377 0.1409947 -0.3027945 0.4262668 -0.3491166 0.0290519
-0.4057682 -0.4872402 -0.1642450 -0.2723291 -0.0664067 -0.5289287 0.8641592 -0.6846349 -0.4076377 0.1409947 -0.3027945 0.4192565 0.4980964 -0.4058676
-0.3942784 -0.4872402 -0.1642450 -0.2723291 -0.0664067 -0.2741669 0.9529728 -0.5922195 -0.4076377 0.1409947 -0.3027945 0.4406159 0.6213273 -0.4167406
-0.4035570 -0.4872402 -0.1642450 -0.2723291 -0.0664067 -0.0436004 0.5550881 -0.7306527 -0.4076377 0.1409947 -0.3027945 0.3512352 -0.3085064 -0.4384865
-0.4001820 -0.4872402 -0.1642450 -0.2723291 -0.0664067 -0.5075800 0.6971898 -0.6325385 -0.4076377 0.1409947 -0.3027945 -0.1288575 0.4350805 -0.4602325
-0.4048044 -0.4872402 -0.1642450 -0.2723291 -0.0664067 -0.1546139 0.1394408 -0.5057404 -0.4076377 0.1409947 -0.3027945 0.4011832 -0.0858504 -0.1449159
-0.4025490 -0.4872402 -0.1642450 -0.2723291 -0.0664067 -0.3752177 0.4982475 -0.4975246 -0.4076377 0.1409947 -0.3027945 0.4144370 -0.3295117 -0.3623756
-0.4049207 -0.4872402 -0.1642450 -0.2723291 -0.0664067 -0.5872820 0.1607560 -0.6256999 -0.4076377 0.1409947 -0.3027945 -0.1976456 0.3804668 -0.2318998
-0.4032721 -0.4872402 -0.1642450 -0.2723291 -0.0664067 -0.7879603 -0.1198948 -0.4919208 -0.4076377 0.1409947 -0.3027945 0.3814669 0.1340048 -0.3515026
-0.4120810 -0.4872402 2.1155211 -0.2723291 0.2270061 -0.5901285 0.0399696 -0.7300828 -0.8670245 -1.3067576 0.2976825 0.3557261 0.2404316 -0.0579320
-0.4117718 -0.4872402 2.1155211 -0.2723291 0.2270061 -0.3994130 0.5515356 -0.7587192 -0.8670245 -1.3067576 0.2976825 0.2299797 0.2264281 -0.2427728
-0.4092908 -0.4872402 2.1155211 -0.2723291 0.2270061 -0.4606127 0.8641592 -0.8111956 -0.8670245 -1.3067576 0.2976825 0.2345802 0.7389569 -0.2210268
-0.4026188 -0.4872402 2.1155211 -0.2723291 0.2270061 -0.6100540 1.0098134 -0.8788687 -0.8670245 -1.3067576 0.2976825 0.1493618 1.7864202 -0.5689624
-0.4086514 -0.4872402 2.1155211 -0.2723291 0.2270061 -0.5773192 0.9671829 -0.8494724 -0.8670245 -1.3067576 0.2976825 0.2487102 0.6899446 -0.4058676
-0.4004517 -0.4872402 2.1155211 -0.2723291 0.2270061 -0.4250315 0.7042949 -0.8558361 -0.8670245 -1.3067576 0.2976825 0.3104881 0.3020471 -0.1231699
-0.3750691 -0.4872402 2.1155211 -0.2723291 0.2270061 -0.9559038 0.9600779 -0.9677698 -0.8670245 -1.3067576 0.2976825 0.0286541 2.0454854 -0.7429302
-0.3899734 -0.4872402 1.5674443 -0.2723291 0.5980871 -0.8420438 0.9742880 -0.9530004 -0.6373311 0.1706618 1.2676838 0.3881485 0.6353309 -0.6885653
-0.3822678 -0.4872402 1.5674443 -0.2723291 0.5980871 0.2083149 1.0737592 -0.9415079 -0.6373311 0.1706618 1.2676838 0.4406159 0.3832675 -0.4928515
-0.3176492 -0.4872402 1.5674443 -0.2723291 0.5980871 -0.9217458 0.9281050 -0.8620098 -0.6373311 0.1706618 1.2676838 0.4406159 0.7963713 -0.8951520
-0.3805669 -0.4872402 1.5674443 -0.2723291 0.5980871 0.2467426 1.0773117 -0.7961887 -0.6373311 0.1706618 1.2676838 0.4202424 -0.0074307 -0.3623756
-0.2814126 -0.4872402 1.5674443 -0.2723291 0.5980871 0.0588736 1.0346812 -0.7237666 -0.6373311 0.1706618 1.2676838 0.4406159 -0.0550427 -0.3188837
-0.3515035 -0.4872402 1.5674443 -0.2723291 0.5980871 0.1243431 1.0417863 -0.6969823 -0.6373311 0.1706618 1.2676838 0.3185937 -0.2146828 0.0507979
-0.3817574 -0.4872402 1.5674443 -0.2723291 0.5980871 -0.6584445 0.9529728 -0.6293092 -0.6373311 0.1706618 1.2676838 0.3506875 0.3328548 -0.4493595
-0.3066139 -0.4872402 1.5674443 -0.2723291 0.5980871 -0.7509558 1.0595490 -0.6881492 -0.6373311 0.1706618 1.2676838 -1.0286891 0.6521351 -0.7538032
-0.3552552 -0.4872402 1.5674443 -0.2723291 0.5980871 0.0716829 1.0524439 -0.7998930 -0.6373311 0.1706618 1.2676838 0.4161895 0.6031228 -0.4819785
-0.3825921 -0.4872402 1.5674443 -0.2723291 0.5980871 -0.4876545 0.8854745 -0.8681835 -0.6373311 0.1706618 1.2676838 0.2363328 0.5947207 -0.5580894
-0.3791404 -0.4872402 1.5674443 -0.2723291 0.5980871 0.2410496 1.0595490 -0.9237941 -0.6373311 0.1706618 1.2676838 0.4097270 0.2712393 -0.5907084
-0.3910604 -0.4872402 1.5674443 -0.2723291 0.5980871 -0.6086307 1.0524439 -1.0098459 -0.6373311 0.1706618 1.2676838 0.3873818 1.2136763 -1.0038819
-0.3567968 -0.4872402 1.5674443 -0.2723291 0.5980871 -0.1901952 1.0417863 -1.0097984 -0.6373311 0.1706618 1.2676838 0.4406159 0.8131756 -0.5145975
-0.3862822 -0.4872402 1.5674443 -0.2723291 0.5980871 -0.1574604 0.8890270 -1.0367727 -0.6373311 0.1706618 1.2676838 0.3440059 1.6113762 -0.9277710
-0.2307590 -0.4872402 1.5674443 -0.2723291 0.5980871 -1.8013144 1.1163897 -1.1186928 -0.6373311 0.1706618 1.2676838 0.4406159 3.0467371 -0.8842790
-0.0340024 -0.4872402 1.2307270 3.6647712 2.7296452 -1.2547863 1.1163897 -1.1746359 -0.5224844 -0.0310742 -1.7347012 0.4406159 1.9838699 -0.9930089
0.0562546 -0.4872402 1.2307270 -0.2723291 2.7296452 -1.1622751 1.1163897 -1.1318000 -0.5224844 -0.0310742 -1.7347012 0.4406159 1.9278558 -0.7538032
-0.0969342 -0.4872402 1.2307270 -0.2723291 2.7296452 -1.9664114 1.0382338 -1.1630958 -0.5224844 -0.0310742 -1.7347012 0.4406159 2.3297568 -1.1669767
-0.1434839 -0.4872402 1.2307270 -0.2723291 2.7296452 -0.2200834 1.1163897 -1.1283332 -0.5224844 -0.0310742 -1.7347012 -2.0128627 2.1211044 -0.9495170
-0.1695595 -0.4872402 1.2307270 -0.2723291 2.7296452 -0.9345550 1.1163897 -1.0820306 -0.5224844 -0.0310742 -1.7347012 -2.0527336 0.5597119 -0.7538032
-0.1447302 -0.4872402 1.2307270 -0.2723291 2.7296452 -1.9336767 0.9636304 -1.1085299 -0.5224844 -0.0310742 -1.7347012 0.3837671 2.3633653 -0.8625331
-0.1491050 -0.4872402 1.2307270 -0.2723291 2.7296452 -1.5636316 0.8961321 -1.0758569 -0.5224844 -0.0310742 -1.7347012 0.0034610 2.1939227 -0.5145975
-0.1022553 -0.4872402 1.2307270 -0.2723291 2.7296452 -0.9786758 0.9352101 -1.0777090 -0.5224844 -0.0310742 -1.7347012 -0.0528401 1.2318808 -0.7755492
-0.2275084 -0.4872402 1.2307270 -0.2723291 2.7296452 -0.2314694 1.0204711 -1.0338758 -0.5224844 -0.0310742 -1.7347012 0.1766361 0.2026221 -0.1122969
-0.2461422 -0.4872402 1.2307270 -0.2723291 2.7296452 -1.2533631 1.1163897 -1.0464131 -0.5224844 -0.0310742 -1.7347012 -0.1651137 0.0877932 -0.3188837
-0.2891275 -0.4872402 1.2307270 3.6647712 2.7296452 -1.8112772 0.6900847 -1.0375800 -0.5224844 -0.0310742 -1.7347012 -0.1467118 -0.0746476 -0.7864221
-0.1702419 -0.4872402 1.2307270 -0.2723291 2.7296452 -0.8192718 1.0631016 -1.0314063 -0.5224844 -0.0310742 -1.7347012 -1.0375614 0.4392816 -0.3406296
-0.2557300 -0.4872402 1.2307270 3.6647712 2.7296452 -0.2215067 0.9742880 -0.9714740 -0.5224844 -0.0310742 -1.7347012 -0.3905371 0.3454580 -0.6015814
-0.0091278 -0.4872402 1.2307270 3.6647712 2.7296452 -0.1887719 0.4982475 -0.9733261 -0.5224844 -0.0310742 -1.7347012 -2.9428165 0.3314545 -0.7538032
-0.1356551 -0.4872402 1.2307270 -0.2723291 2.7296452 -1.4412321 0.9032372 -0.9776477 -0.5224844 -0.0310742 -1.7347012 -2.9360253 0.4882939 -1.0256279
-0.2778505 -0.4872402 1.2307270 -0.2723291 0.4341211 0.9370190 1.0240236 -0.9107344 -0.5224844 -0.0310742 -1.7347012 0.0740016 -1.1291127 2.0405547
-0.2639855 -0.4872402 1.2307270 -0.2723291 0.4341211 -0.3111714 1.1163897 -0.9677223 -0.5224844 -0.0310742 -1.7347012 -0.0304949 -0.8714479 0.1921468
-0.2544314 -0.4872402 1.2307270 -0.2723291 2.7296452 0.3207517 1.1163897 -0.9636382 -0.5224844 -0.0310742 -1.7347012 0.0836407 -0.7370141 0.0834169
-0.2720515 -0.4872402 1.2307270 3.6647712 0.4341211 -0.0492934 0.8535016 -0.9482040 -0.5224844 -0.0310742 -1.7347012 -0.1944691 -1.0016807 0.4857174
-0.2499741 -0.4872402 1.2307270 -0.2723291 0.4341211 1.7141136 0.7895559 -0.8662839 -0.5224844 -0.0310742 -1.7347012 0.1944903 -1.5296134 2.9865046
-0.2069109 -0.4872402 1.2307270 3.6647712 0.4341211 2.1595909 1.0524439 -0.8331359 -0.5224844 -0.0310742 -1.7347012 0.3607647 -1.5030067 2.9865046
-0.2435032 -0.4872402 1.2307270 3.6647712 0.4341211 2.9751133 0.8996847 -0.7755306 -0.5224844 -0.0310742 -1.7347012 0.3480587 -1.3069574 2.9865046
-0.1594090 -0.4872402 1.2307270 -0.2723291 0.4341211 -0.6129005 0.8250813 -0.6520568 -0.5224844 -0.0310742 -1.7347012 0.4210091 -0.1418645 0.0181789
-0.0801628 -0.4872402 1.2307270 -0.2723291 0.4341211 -0.2613577 0.8677118 -0.7178779 -0.5224844 -0.0310742 -1.7347012 -1.2762386 -0.3981289 0.2682577
-0.1864006 -0.4872402 1.2307270 -0.2723291 0.4341211 2.3403437 0.9813931 -0.8306664 -0.5224844 -0.0310742 -1.7347012 0.1382988 -1.2537440 2.9865046
-0.2108044 -0.4872402 1.2307270 -0.2723291 0.4341211 -0.5801657 0.3774611 -0.6502047 -0.5224844 -0.0310742 -1.7347012 -1.4137053 -0.0718469 0.1377818
-0.1526614 -0.4872402 1.2307270 -0.2723291 0.4341211 0.0489109 0.9778406 -0.8049744 -0.5224844 -0.0310742 -1.7347012 -0.6526548 -0.2174835 0.1377818
-0.1353238 -0.4872402 1.2307270 -0.2723291 0.4341211 0.1670406 0.9458677 -0.7278033 -0.5224844 -0.0310742 -1.7347012 -0.2917364 -0.1866758 -0.0253130
-0.2797292 -0.4872402 1.2307270 -0.2723291 0.4341211 -0.5830122 0.9245525 -0.6502047 -0.5224844 -0.0310742 -1.7347012 -0.7052317 0.2488337 -0.5580894
-0.1510919 -0.4872402 1.2307270 -0.2723291 0.4341211 -0.5758960 1.0204711 -0.6678710 -0.5224844 -0.0310742 -1.7347012 -0.0935872 -0.0872508 -0.3732486
-0.4039255 -0.4872402 -1.0330050 -0.2723291 -0.3857090 -1.0142570 0.7078474 -0.5693769 -0.5224844 -0.6659492 -0.8570810 0.4406159 0.2852429 0.0616709
-0.4094315 -0.4872402 -1.0330050 -0.2723291 -0.3857090 0.1869661 0.5515356 -0.5455370 -0.5224844 -0.6659492 -0.8570810 0.4252810 -0.5059560 0.1160358
-0.4102814 -0.4872402 -1.0330050 -0.2723291 -0.3857090 -0.6057842 0.0044442 -0.5191326 -0.5224844 -0.6659492 -0.8570810 0.4004165 -0.4219349 0.0073060
-0.4123542 -0.4872402 -1.0330050 -0.2723291 -0.3857090 0.3719887 -1.2602606 -0.3147360 -0.5224844 -0.6659492 -0.8570810 0.3755520 -1.0254866 0.7466691
-0.4119380 -0.4872402 -1.0330050 -0.2723291 -0.3857090 -0.3766409 -0.7593522 -0.1140436 -0.5224844 -0.6659492 -0.8570810 0.4004165 -0.3561184 0.0725439
-0.4137947 -0.4872402 -1.0330050 -0.2723291 -0.3857090 0.0432179 0.1714136 -0.2267846 -0.5224844 -0.6659492 -0.8570810 0.4263763 -0.8910529 0.2247657
-0.4123798 -0.4872402 -1.0330050 -0.2723291 -0.3857090 0.8188892 0.2069391 -0.4177891 -0.5224844 -0.6659492 -0.8570810 0.3789476 -0.8028307 0.8010341
-0.4133820 -0.4872402 -1.2647715 -0.2723291 -0.5755644 0.9896793 -0.3614676 -0.4587729 -0.7521778 -1.2770905 -0.3027945 0.4406159 -1.0660969 1.5947622
-0.4124426 -0.4872402 -1.2647715 -0.2723291 -0.5755644 2.1069307 0.5231153 -0.5005640 -0.7521778 -1.2770905 -0.3027945 0.4259382 -0.7132081 1.8774599
-0.4120938 -0.4872402 -1.2647715 -0.2723291 -0.5755644 -0.2001579 -0.2264710 -0.5685221 -0.7521778 -1.2770905 -0.3027945 0.4406159 -0.4485416 1.4860323
-0.4095187 -0.4872402 -1.2647715 -0.2723291 -0.5755644 1.2387480 0.8392915 -0.5197499 -0.7521778 -1.2770905 -0.3027945 0.4101651 -1.0969046 1.6708731
-0.4084666 -0.4872402 -1.2647715 -0.2723291 -0.5755644 0.3961839 0.9600779 -0.4502247 -0.7521778 -1.2770905 -0.3027945 0.4406159 -0.9764743 1.0837317
-0.4104430 -0.4872402 -1.2647715 -0.2723291 -0.5755644 -0.9687130 0.7540305 -0.3833114 -0.7521778 -1.2770905 -0.3027945 0.3759901 0.1858179 0.4204795
-0.4130715 -0.4872402 -1.2647715 -0.2723291 -0.5755644 -0.1873487 0.0079967 -0.2447358 -0.7521778 -1.2770905 -0.3027945 0.3333809 0.0695886 0.7684151
-0.4135889 -0.4872402 -1.2647715 -0.2723291 -0.5755644 2.2008652 -0.5319896 -0.2829652 -0.7521778 -1.2770905 -0.3027945 0.3938444 -1.1487176 2.9865046
-0.4109463 1.4422310 -1.1219217 -0.2723291 -1.0156836 0.7078757 -0.9760573 -0.0030596 -0.5224844 -0.0607412 -1.5037485 0.4074267 -0.8364391 1.0293668
-0.4054776 1.4422310 -1.1219217 -0.2723291 -1.0156836 0.3862212 -1.4023623 0.3664594 -0.5224844 -0.0607412 -1.5037485 0.2866094 -1.1333138 0.7901611
-0.4103709 1.4422310 -1.1219217 -0.2723291 -1.0156836 1.2814456 -1.0542132 0.3664594 -0.5224844 -0.0607412 -1.5037485 0.4406159 -1.0170845 1.3446835
-0.4095594 1.4422310 -1.1219217 -0.2723291 -1.0156836 0.9484050 -1.6723554 1.2749890 -0.5224844 -0.0607412 -1.5037485 0.2300893 -1.0576947 1.5730162
-0.4120671 1.4422310 -1.1219217 -0.2723291 -1.0156836 0.6466760 -1.3419691 1.2749890 -0.5224844 -0.0607412 -1.5037485 0.3618601 -1.1151092 0.8662720
-0.4100291 1.4422310 -1.1219217 -0.2723291 -1.0156836 1.2714828 -1.5018334 1.2749890 -0.5224844 -0.0607412 -1.5037485 0.3704038 -1.3699732 1.5077783
-0.4175591 2.0853880 -1.1962619 -0.2723291 -1.3263561 0.7334942 -2.0844502 1.1514203 -0.9818712 -0.8498849 -1.3189864 0.4019500 -1.0674972 0.9315099
-0.4184287 2.0853880 -1.1962619 -0.2723291 -1.3263561 0.4545372 -1.7682741 1.1514203 -0.9818712 -0.8498849 -1.3189864 0.2193548 -1.1585201 0.7140502
-0.4184962 2.9429307 -1.5563017 -0.2723291 -1.1451305 2.2634882 -1.2993386 0.8801579 -0.6373311 -0.9092190 -1.8732728 0.4113700 -1.3559697 2.9865046
-0.4154386 2.9429307 -1.4017907 -0.2723291 -1.3004667 1.4266171 -1.2247352 1.6687754 -0.8670245 -0.4701466 -2.7047025 0.4406159 -1.2005307 1.1707156
-0.4146771 2.9429307 -1.4017907 -0.2723291 -1.3004667 1.1704320 -1.1359217 1.6687754 -0.8670245 -0.4701466 -2.7047025 -0.0258945 -0.5661712 0.8445260
-0.4157211 2.9429307 -1.4017907 -0.2723291 -1.3004667 1.4081148 -1.0755284 1.6687754 -0.8670245 -0.4701466 -2.7047025 0.3891344 -0.8448412 1.3120645
-0.4164395 3.5860878 -1.4090789 -0.2723291 -1.3090965 0.9825630 -1.8926130 1.8323307 -0.7521778 -0.0370076 -0.6723188 0.4406159 -1.1333138 1.3446835
-0.4180346 3.5860878 -1.4090789 -0.2723291 -1.3090965 1.2102830 -1.9423486 1.8323307 -0.7521778 -0.0370076 -0.6723188 0.3026016 -1.1487176 1.1272237
-0.4160966 3.0501236 -1.3274505 -0.2723291 -1.2055390 -0.1745394 -1.0719759 1.1753552 -0.8670245 -0.3574118 -1.7347012 0.4063314 -0.7314127 0.1704008
-0.4175707 3.0501236 -1.3274505 -0.2723291 -1.2055390 1.8863269 -1.8784028 1.1753552 -0.8670245 -0.3574118 -1.7347012 0.4239665 -1.3363648 2.1492845
-0.4160210 3.5860878 -1.2327031 -0.2723291 -1.1960462 2.2321767 -1.2567081 0.6282713 -0.6373311 -1.0931548 -1.7347012 0.3954874 -1.2383402 2.8234098
-0.4177661 3.5860878 -1.2327031 -0.2723291 -1.1960462 2.4897850 -1.3028911 0.6282713 -0.6373311 -1.0931548 -1.7347012 0.3710611 -1.3685729 2.9865046
-0.4042417 -0.4872402 -0.0797012 -0.2723291 -0.5669346 -0.5602402 -1.6439351 0.0714046 -0.6373311 -0.7786840 0.0667298 0.4406159 -0.2496916 0.0073060
-0.3933983 -0.4872402 -0.0797012 -0.2723291 -0.5669346 0.0588736 -0.5710675 0.2658758 -0.6373311 -0.7786840 0.0667298 0.4183803 -0.2356881 0.2030197
-0.3908058 -0.4872402 -0.0797012 -0.2723291 -0.5669346 -0.7139512 0.1465458 0.2658758 -0.6373311 -0.7786840 0.0667298 0.3587931 0.7571615 -0.0035670
-0.4043057 -0.4872402 -0.0797012 3.6647712 -0.5669346 -0.3140179 -0.3365998 0.2109299 -0.6373311 -0.7786840 0.0667298 0.2699601 0.2810418 0.2030197
-0.3694468 -0.4872402 -0.0797012 3.6647712 -0.5669346 -1.3387581 1.1163897 0.0379717 -0.6373311 -0.7786840 0.0667298 0.4406159 1.4615386 -0.2753917
-0.3998193 -0.4872402 -0.0797012 3.6647712 -0.5669346 -0.4620360 0.8357389 0.0389689 -0.6373311 -0.7786840 0.0667298 0.4006356 0.6465337 -0.0905509
-0.3764142 -0.4872402 -0.0797012 3.6647712 -0.5669346 -1.2533631 0.7114000 -0.0617572 -0.6373311 -0.7786840 0.0667298 0.4224331 1.5861699 -0.3515026
-0.3948516 -0.4872402 -0.0797012 3.6647712 -0.5669346 -0.6797932 -0.5248845 -0.0676459 -0.6373311 -0.7786840 0.0667298 0.3753329 0.4728900 -0.0144400
-0.4037651 -0.4872402 -0.0797012 -0.2723291 -0.5669346 0.1286129 -1.2886809 0.0714046 -0.6373311 -0.7786840 0.0667298 0.3191414 -0.4583441 0.6053203
-0.3864391 -0.4872402 -0.0797012 -0.2723291 -0.5669346 -1.2419771 -2.0880028 -0.0985619 -0.6373311 -0.7786840 0.0667298 -0.0848244 2.3661660 0.1269088
-0.3970802 -0.4872402 -0.0797012 -0.2723291 -0.5669346 -0.1460744 -0.9298742 0.0714046 -0.6373311 -0.7786840 0.0667298 0.4047979 -0.4457409 0.2682577
-0.4148003 -0.4872402 0.4013236 3.6647712 -0.0405174 -0.5645100 -0.4467286 -0.3243289 -0.5224844 -0.7846174 -0.9494620 0.3957065 0.1200013 0.0834169
-0.4119485 -0.4872402 0.4013236 -0.2723291 -0.0405174 0.5086207 0.5870610 -0.1775851 -0.5224844 -0.7846174 -0.9494620 0.3954874 -0.4149332 0.6705582
-0.4072331 -0.4872402 0.4013236 3.6647712 -0.0405174 -0.4748452 0.8961321 -0.4301365 -0.5224844 -0.7846174 -0.9494620 0.4406159 0.7375566 -0.1122969
-0.4068192 -0.4872402 0.4013236 3.6647712 -0.0405174 0.1257664 0.8463965 -0.2050342 -0.5224844 -0.7846174 -0.9494620 0.4060028 -0.3015046 0.0507979
-0.3784708 -0.4872402 -0.7196100 3.6647712 -0.4115983 0.9484050 0.7078474 -0.4432437 -0.1779443 -0.6006817 -0.4875567 0.3836576 -0.4121325 0.4530985
-0.3727021 -0.4872402 -0.7196100 3.6647712 -0.4115983 -0.1716929 0.8073186 -0.3547700 -0.1779443 -0.6006817 -0.4875567 0.4224331 1.2332812 -0.0905509
-0.3476077 -0.4872402 -0.7196100 3.6647712 -0.4115983 0.8459310 0.3241729 -0.2483451 -0.1779443 -0.6006817 -0.4875567 0.3693085 -0.3813247 0.5400824
-0.3486378 -0.4872402 -0.7196100 -0.2723291 -0.4115983 0.4744627 0.4343017 -0.2483451 -0.1779443 -0.6006817 -0.4875567 0.4406159 -0.7076067 0.8227800
-0.3834420 -0.4872402 -0.7196100 -0.2723291 -0.4374877 2.8199790 0.3454882 -0.4277145 -0.1779443 -0.6006817 -0.4875567 0.3108167 -1.1921285 2.4211092
-0.3588418 -0.4872402 -0.7196100 -0.2723291 -0.4374877 3.4732509 0.5124576 -0.4277145 -0.1779443 -0.6006817 -0.4875567 0.2774085 -1.1235113 2.9865046
-0.3756748 -0.4872402 -0.7196100 -0.2723291 -0.4374877 2.4983245 0.6367966 -0.2751294 -0.1779443 -0.6006817 -0.4875567 0.3363384 -1.3335641 1.6382541
-0.3721591 -0.4872402 -0.7196100 -0.2723291 -0.4374877 1.2501340 0.4023288 -0.2751294 -0.1779443 -0.6006817 -0.4875567 0.1687496 -0.8812504 0.9858749
-0.3854347 -0.4872402 -0.7196100 -0.2723291 -0.4374877 1.9944939 -1.8322198 -0.1994304 -0.1779443 -0.6006817 -0.4875567 0.2282272 -1.2229363 2.6276960
-0.3687411 -0.4872402 -0.7196100 -0.2723291 -0.4374877 0.3805282 -1.6759080 -0.1994304 -0.1779443 -0.6006817 -0.4875567 0.2592256 -1.2453419 0.9750019
-0.3576710 -0.4872402 -0.7196100 -0.2723291 -0.4374877 -0.4321477 -0.0168711 -0.0586703 -0.1779443 -0.6006817 -0.4875567 0.2374281 -0.1404642 0.1921468
-0.3662788 -0.4872402 -0.7196100 -0.2723291 -0.4374877 1.6045233 0.2957526 -0.0586703 -0.1779443 -0.6006817 -0.4875567 0.2132208 -1.0366895 0.9967478
-0.3532195 -0.4872402 -0.7196100 -0.2723291 -0.4115983 2.9210298 0.1678611 0.0205904 -0.1779443 -0.6006817 -0.4875567 0.3202367 -1.4259873 2.0840466
-0.3815656 -0.4872402 -0.7196100 -0.2723291 -0.4115983 2.7929373 0.0648374 -0.0679783 -0.1779443 -0.6006817 -0.4875567 0.2440002 -1.2187352 2.8016638
-0.3680285 -0.4872402 -0.7196100 3.6647712 -0.4115983 0.6281737 -0.0737117 -0.0679783 -0.1779443 -0.6006817 -0.4875567 0.0386218 -0.6445909 0.7031772
-0.3816842 -0.4872402 -0.7196100 -0.2723291 -0.4115983 -0.2827064 -0.2513388 -0.0679783 -0.1779443 -0.6006817 -0.4875567 0.2199025 -0.2482913 0.1595278
-0.3595800 -0.4872402 -0.7196100 3.6647712 -0.4115983 0.4929649 0.2815424 0.1676191 -0.1779443 -0.6006817 -0.4875567 0.3480587 -0.4359384 0.2791307
-0.3605973 -0.4872402 -0.7196100 -0.2723291 -0.4115983 1.5276678 0.1074679 0.1676191 -0.1779443 -0.6006817 -0.4875567 0.3658034 -1.1095078 0.9750019
-0.4105174 0.7990739 -0.9047317 -0.2723291 -1.0933517 0.2794774 -1.7789317 1.1373158 -0.4076377 -0.6422155 -0.8570810 0.2490389 -0.8812504 0.1269088
-0.4093455 0.7990739 -0.9047317 -0.2723291 -1.0933517 0.4573837 -0.9369793 1.1373158 -0.4076377 -0.6422155 -0.8570810 0.2969057 -0.7398148 0.0834169
-0.4069308 0.7990739 -0.9047317 -0.2723291 -1.0933517 0.8715495 -0.5071218 1.2067460 -0.4076377 -0.6422155 -0.8570810 0.3787285 -0.1782737 -0.0579320
-0.4077644 0.7990739 -0.9047317 -0.2723291 -1.0933517 -0.2698972 -0.1234473 1.2067460 -0.4076377 -0.6422155 -0.8570810 0.4156419 -0.0354378 -0.2645187
-0.4081387 0.7990739 -0.9047317 -0.2723291 -1.0933517 0.1044176 -0.5568574 1.5388905 -0.4076377 -0.6422155 -0.8570810 0.1760884 -0.2006793 -0.0361860
-0.4052706 0.7990739 -0.9047317 -0.2723291 -1.0933517 0.1542314 -2.1590536 1.5388905 -0.4076377 -0.6422155 -0.8570810 0.1975573 -1.0450916 0.1269088
-0.3961432 0.4560568 -0.7691701 -0.2723291 -1.0674624 -0.9843688 0.2815424 1.9755128 -0.2927910 -0.4642132 0.2976825 0.1732405 -0.0214342 -0.5363434
-0.3978580 0.4560568 -0.7691701 -0.2723291 -1.0674624 -0.9672898 0.0577323 1.9755128 -0.2927910 -0.4642132 0.2976825 0.3555071 0.8131756 -0.4384865
-0.3805937 0.4560568 -0.7691701 -0.2723291 -1.0674624 -0.2513949 -1.1963149 2.0232877 -0.2927910 -0.4642132 0.2976825 0.3670082 -0.4891518 0.1921468
-0.3972488 0.4560568 -0.7691701 -0.2723291 -1.0674624 -0.0834514 0.3774611 2.0232877 -0.2927910 -0.4642132 0.2976825 0.2132208 -0.3505170 -0.2210268
-0.4009900 0.4560568 -0.7691701 -0.2723291 -1.0674624 0.2111614 -0.6918540 1.9145357 -0.2927910 -0.4642132 0.2976825 0.1975573 -0.4387391 0.2138927
-0.3979278 0.4560568 -0.7691701 -0.2723291 -1.0674624 0.6167877 -1.8144571 1.9145357 -0.2927910 -0.4642132 0.2976825 0.4060028 -0.8532433 0.3987335
-0.4037907 0.4560568 -0.7691701 -0.2723291 -1.0674624 0.2880169 -1.9743215 1.7104241 -0.2927910 -0.4642132 0.2976825 0.4338247 -0.9456666 0.2030197
-0.3952120 0.4560568 -0.7691701 -0.2723291 -1.0674624 0.2182776 -2.1199757 1.7104241 -0.2927910 -0.4642132 0.2976825 0.2234076 -1.2691479 0.2465117
-0.4105441 0.4560568 -0.7691701 -0.2723291 -1.0674624 0.9569445 -2.1945790 2.4275218 -0.2927910 -0.4642132 0.2976825 0.3222084 -1.2775500 0.7684151
-0.3772094 0.4560568 -0.7691701 -0.2723291 -1.0674624 2.8100163 -2.1377384 2.4275218 -0.2927910 -0.4642132 0.2976825 0.4406159 -1.2761497 2.2036495
-0.4144992 2.9429307 -1.0927687 -0.2723291 -1.4040242 -0.2513949 -1.2993386 2.5764502 -0.9818712 -0.5532144 -0.9494620 0.3966923 -0.8518430 -0.0688050
-0.4159768 2.9429307 -1.0927687 -0.2723291 -1.4040242 -0.5815890 -1.7576164 2.5764502 -0.9818712 -0.5532144 -0.9494620 0.4217758 -0.4765487 -0.1775348
-0.4183136 3.3717021 -1.0767345 -0.2723291 -1.3867646 1.6642999 -1.2211827 1.2067460 -0.7521778 -0.9744866 -1.1804147 0.3249467 -1.3363648 2.3341253
-0.3490052 0.3703025 -1.0446662 -0.2723291 0.7965722 3.4433626 0.6510068 -0.9469692 -0.5224844 -0.8558183 -2.5199404 0.3617506 -1.0548940 2.9865046
-0.3429632 0.3703025 -1.0446662 -0.2723291 0.7965722 1.4920866 1.1163897 -0.9025187 -0.5224844 -0.8558183 -2.5199404 0.2915385 -0.6810000 1.4642863
-0.3437607 0.3703025 -1.0446662 -0.2723291 0.7965722 0.7932707 1.1163897 -0.8473829 -0.5224844 -0.8558183 -2.5199404 0.3861769 -0.8056314 0.8227800
-0.3573095 0.3703025 -1.0446662 -0.2723291 0.7965722 1.3070641 0.4698271 -0.7992281 -0.5224844 -0.8558183 -2.5199404 0.3957065 -0.4289367 1.2250806
-0.3580059 0.3703025 -1.0446662 -0.2723291 0.7965722 1.7582344 0.7398203 -0.7860734 -0.5224844 -0.8558183 -2.5199404 0.3471824 -0.7552187 2.2362684
-0.3596311 0.3703025 -1.0446662 -0.2723291 0.7965722 3.0078481 0.8144237 -0.7154559 -0.5224844 -0.8558183 -2.5199404 0.3306426 -0.9442662 2.8560287
-0.3241585 0.3703025 -1.0446662 -0.2723291 0.7965722 1.4835471 0.9209999 -0.8150422 -0.5224844 -0.8558183 -2.5199404 0.4024977 -0.1964782 0.9206369
-0.3561515 0.3703025 -1.0446662 -0.2723291 0.7965722 1.3113338 0.8179762 -0.8856597 -0.5224844 -0.8558183 -2.5199404 0.3419247 -0.6375891 1.5186513
-0.3315571 0.3703025 -1.0446662 -0.2723291 0.7965722 -1.0313360 -0.2051558 -0.8588754 -0.5224844 -0.8558183 -2.5199404 0.3913251 -0.3085064 0.0290519
-0.3287576 0.3703025 -1.0446662 -0.2723291 0.7965722 1.0380698 0.5692983 -0.7893502 -0.5224844 -0.8558183 -2.5199404 0.3000823 0.2992464 0.8880180
-0.3528649 0.3703025 -1.0446662 -0.2723291 0.1752274 2.8640998 -0.0559490 -0.6522468 -0.5224844 -0.8558183 -2.5199404 0.3052304 -0.7300124 2.9865046
-0.3572641 0.3703025 -1.0446662 -0.2723291 0.1752274 1.6870719 -0.5675150 -0.4383522 -0.5224844 -0.8558183 -2.5199404 0.3683227 -1.3293630 2.2797604
-0.4095629 0.3703025 -0.6088285 3.6647712 -0.7826793 -0.5189660 -0.2513388 0.0581549 -0.7521778 -1.0990882 0.0667298 0.3797143 0.1396062 -0.1992808
-0.3853219 0.3703025 -0.6088285 -0.2723291 -0.7826793 -0.6100540 -0.9405319 0.3010658 -0.7521778 -1.0990882 0.0667298 0.3502494 0.0485834 -0.1557889
-0.4012551 0.3703025 -0.6088285 -0.2723291 -0.7826793 -0.0635259 -1.8570876 0.3010658 -0.7521778 -1.0990882 0.0667298 0.4406159 -0.8490423 0.2900036
-0.4067785 0.3703025 -0.6088285 -0.2723291 -0.7826793 0.3606027 -0.3508100 0.0581549 -0.7521778 -1.0990882 0.0667298 0.4193661 -0.6894022 0.2030197
-0.3943063 0.3703025 -0.6088285 3.6647712 -0.7826793 2.0016102 -0.5959353 0.2713846 -0.7521778 -1.0990882 0.0667298 0.3734708 -0.8504426 1.3773024
-0.4135401 1.2278453 -0.6889993 3.6647712 -0.9293857 0.6737177 -1.2673657 0.1341862 -0.6373311 -0.9151524 -0.3951756 0.4406159 -1.2775500 1.0728588
-0.4089362 1.2278453 -0.6889993 -0.2723291 -0.9293857 0.8103497 -0.9156641 0.2242746 -0.6373311 -0.9151524 -0.3951756 0.4406159 -1.3545694 1.0293668
-0.4079306 1.2278453 -0.6889993 3.6647712 -0.9293857 1.3981521 -0.6954065 0.4711747 -0.6373311 -0.9151524 -0.3951756 0.3568215 -0.9246613 1.1598427
-0.4129785 1.2278453 -0.6889993 3.6647712 -0.9293857 0.7704987 -1.4556504 0.5070771 -0.6373311 -0.9151524 -0.3951756 0.4028263 -1.1893278 1.1489697
-0.4108266 1.2278453 -0.6889993 -0.2723291 -0.9293857 0.2809007 -1.2957860 0.1639624 -0.6373311 -0.9151524 -0.3951756 0.4406159 -0.7650212 0.7140502
-0.3956433 0.3703025 -1.1379558 -0.2723291 -0.9647679 0.7505732 -1.2922335 0.1451564 -0.5224844 -1.1406221 -1.6423201 0.4406159 -1.0927035 1.3664294
-0.4159420 0.3703025 -1.1379558 -0.2723291 -0.9647679 2.1852094 -0.1447626 0.4272465 -0.5224844 -1.1406221 -1.6423201 0.3355717 -1.2453419 2.4863472
-0.4157943 0.3703025 -1.1379558 -0.2723291 -0.9647679 0.9726003 -1.1146064 0.6884411 -0.5224844 -1.1406221 -1.6423201 0.3894630 -1.1291127 1.3990484
-0.4129762 0.3703025 -1.1379558 3.6647712 -0.9647679 1.9361406 -0.6705387 0.6728644 -0.5224844 -1.1406221 -1.6423201 0.2234076 -1.3503683 2.5515851
-0.4183566 3.3717021 -1.4469778 3.6647712 -1.3263561 2.3318042 -1.5551216 0.9925190 -0.9818712 -1.2474235 -2.2427971 0.4255000 -1.3293630 2.9865046
-0.4190484 3.3717021 -1.1904313 -0.2723291 -1.3349859 1.1433903 -1.6972232 1.6679681 -0.9818712 -0.7312167 -1.4575580 0.4167372 -0.6725979 1.0511128
-0.4188275 1.8710023 -1.2953821 -0.2723291 -1.4299136 0.2396264 -1.3028911 1.6679681 -0.9818712 -0.6422155 -1.4575580 0.4167372 -0.6193846 -0.0579320
(… further rows of the scaled Boston data frame omitted — the full table scrolls through all 506 observations …)

1.3.2. Create a categorical variable (crime rate)

# summary of the scaled crime rate
summary(boston_scaled$crim)
##      Min.   1st Qu.    Median      Mean   3rd Qu.      Max. 
## -0.419367 -0.410563 -0.390280  0.000000  0.007389  9.924110
# create a quantile vector of crim and print it
bins <- quantile(boston_scaled$crim)
knitr::kable(bins, caption="Quantiles of crim") %>% kable_styling(bootstrap_options = "striped", full_width = FALSE, position = "center")
Quantiles of crim
x
0% -0.4193669
25% -0.4105633
50% -0.3902803
75% 0.0073892
100% 9.9241096
# create a categorical variable 'crime'
crime <- cut(boston_scaled$crim, breaks = bins, include.lowest = TRUE, labels = c("low", "med_low", "med_high", "high"))

# look at the table of the new factor crime
knitr::kable(table(crime), caption="Class frequencies of the factor crime") %>% kable_styling(bootstrap_options = "striped", full_width = FALSE, position = "center")
Class frequencies of the factor crime
crime Freq
low 127
med_low 126
med_high 126
high 127
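Because the breaks come from quantile(), cut() splits the observations into four classes of (nearly) equal size, which is exactly what the frequency table above shows. A minimal self-contained sketch of the same idea on made-up data (the toy vector x is illustrative, not from the Boston set):

```r
# Toy data: 100 standard-normal draws (illustrative only)
set.seed(42)
x <- rnorm(100)

# quantile() returns five break points (0%, 25%, 50%, 75%, 100%),
# which define four intervals
bins <- quantile(x)

# include.lowest = TRUE makes sure the minimum value falls into
# the first class instead of being dropped as NA
classes <- cut(x, breaks = bins, include.lowest = TRUE,
               labels = c("low", "med_low", "med_high", "high"))

table(classes)  # each of the four classes holds 25 of the 100 observations
```

With 100 distinct values the quartile breaks split the data into exactly 25 per class; with ties (as in the real crim variable) the class sizes can differ slightly, which is why the table above shows 127/126/126/127.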
# remove original crim from the dataset
boston_scaled <- dplyr::select(boston_scaled, -crim)

# add the new categorical value to scaled data
boston_scaled <- data.frame(boston_scaled, crime)

knitr::kable(boston_scaled, caption = "Scaled Boston data set with categorical variable crime") %>% 
  kable_styling(bootstrap_options = "striped", full_width = FALSE, position = "center") %>% 
  scroll_box(width = "100%", height = "300px") # show the full data frame in a scroll box
Scaled Boston data set with categorical variable crime
zn indus chas nox rm age dis rad tax ptratio black lstat medv crime
0.2845483 -1.2866362 -0.2723291 -0.1440749 0.4132629 -0.1198948 0.1400750 -0.9818712 -0.6659492 -1.4575580 0.4406159 -1.0744990 0.1595278 low
-0.4872402 -0.5927944 -0.2723291 -0.7395304 0.1940824 0.3668034 0.5566090 -0.8670245 -0.9863534 -0.3027945 0.4406159 -0.4919525 -0.1014239 low
-0.4872402 -0.5927944 -0.2723291 -0.7395304 1.2814456 -0.2655490 0.5566090 -0.8670245 -0.9863534 -0.3027945 0.3960351 -1.2075324 1.3229375 low
-0.4872402 -1.3055857 -0.2723291 -0.8344581 1.0152978 -0.8090878 1.0766711 -0.7521778 -1.1050216 0.1129203 0.4157514 -1.3601708 1.1815886 low
-0.4872402 -1.3055857 -0.2723291 -0.8344581 1.2273620 -0.5106743 1.0766711 -0.7521778 -1.1050216 0.1129203 0.4406159 -1.0254866 1.4860323 low
-0.4872402 -1.3055857 -0.2723291 -0.8344581 0.2068916 -0.3508100 1.0766711 -0.7521778 -1.1050216 0.1129203 0.4101651 -1.0422909 0.6705582 low
0.0487240 -0.4761823 -0.2723291 -0.2648919 -0.3880270 -0.0701592 0.8384142 -0.5224844 -0.5769480 -1.5037485 0.4263763 -0.0312367 0.0399249 med_low
0.0487240 -0.4761823 -0.2723291 -0.2648919 -0.1603069 0.9778406 1.0236249 -0.5224844 -0.5769480 -1.5037485 0.4406159 0.9097999 0.4965904 med_low
0.0487240 -0.4761823 -0.2723291 -0.2648919 -0.9302853 1.1163897 1.0861216 -0.5224844 -0.5769480 -1.5037485 0.3281233 2.4193794 -0.6559463 med_low
0.0487240 -0.4761823 -0.2723291 -0.2648919 -0.3994130 0.6154813 1.3283202 -0.5224844 -0.5769480 -1.5037485 0.3289995 0.6227277 -0.3949946 med_low
0.0487240 -0.4761823 -0.2723291 -0.2648919 0.1314594 0.9138948 1.2117800 -0.5224844 -0.5769480 -1.5037485 0.3926395 1.0918456 -0.8190411 med_low
0.0487240 -0.4761823 -0.2723291 -0.2648919 -0.3922967 0.5089051 1.1547920 -0.5224844 -0.5769480 -1.5037485 0.4406159 0.0863929 -0.3949946 med_low
0.0487240 -0.4761823 -0.2723291 -0.2648919 -0.5630867 -1.0506607 0.7863653 -0.5224844 -0.5769480 -1.5037485 0.3705134 0.4280788 -0.0905509 med_low
-0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.4776917 -0.2406812 0.4333252 -0.6373311 -0.6006817 1.1753027 0.4406159 -0.6151835 -0.2318998 med_high
-0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.2684739 0.5657458 0.3166900 -0.6373311 -0.6006817 1.1753027 0.2557205 -0.3351131 -0.4711055 med_high
-0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.6413655 -0.4289659 0.3341188 -0.6373311 -0.6006817 1.1753027 0.4265954 -0.5857761 -0.2862647 med_high
-0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.4976172 -1.3952572 0.3341188 -0.6373311 -0.6006817 1.1753027 0.3305330 -0.8504426 0.0616709 med_high
-0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.4193385 0.4662746 0.2198106 -0.6373311 -0.6006817 1.1753027 0.3294377 0.2824421 -0.5472164 med_high
-0.4872402 -0.4368257 -0.2723291 -0.1440749 -1.1793541 -1.1359217 0.0006921 -0.6373311 -0.6006817 1.1753027 -0.7413783 -0.1348628 -0.2536457 med_high
-0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.7936533 0.0328645 0.0006921 -0.6373311 -0.6006817 1.1753027 0.3754425 -0.1922772 -0.4711055 med_high
-0.4872402 -0.4368257 -0.2723291 -0.1440749 -1.0171035 1.0488914 0.0013569 -0.6373311 -0.6006817 1.1753027 0.2179309 1.1716657 -0.9712629 med_high
-0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.4549197 0.7327152 0.1031753 -0.6373311 -0.6006817 1.1753027 0.3927490 0.1648126 -0.3188837 med_high
-0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.2030044 0.8215287 0.0863639 -0.6373311 -0.6006817 1.1753027 0.4406159 0.8495847 -0.7972951 med_high
-0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.6712537 1.1163897 0.1425445 -0.6373311 -0.6006817 1.1753027 0.4147656 1.0120256 -0.8734060 med_high
-0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.5132730 0.9067897 0.2871038 -0.6373311 -0.6006817 1.1753027 0.4124654 0.5106995 -0.7538032 med_high
-0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.9758293 0.6083763 0.3132232 -0.6373311 -0.6006817 1.1753027 -0.5833190 0.5401069 -0.9386440 med_high
-0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.6712537 0.7717932 0.4212153 -0.6373311 -0.6006817 1.1753027 0.2213265 0.3020471 -0.6450733 med_high
-0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.3382132 0.7185050 0.3126533 -0.6373311 -0.6006817 1.1753027 -0.5508966 0.6479340 -0.8407871 med_high
-0.4872402 -0.4368257 -0.2723291 -0.1440749 0.2994029 0.9174474 0.3132707 -0.6373311 -0.6006817 1.1753027 0.3424724 0.0205763 -0.4493595 med_high
-0.4872402 -0.4368257 -0.2723291 -0.1440749 0.5541647 0.6652169 0.2108350 -0.6373311 -0.6006817 1.1753027 0.2580207 -0.0942525 -0.1666618 med_high
-0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.8135788 0.9067897 0.2079856 -0.6373311 -0.6006817 1.1753027 0.0382932 1.3929213 -1.0691198 med_high
-0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.3026319 1.1163897 0.1804414 -0.6373311 -0.6006817 1.1753027 0.2196834 0.0541848 -0.8734060 med_high
-0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.4762685 0.4769322 0.0925851 -0.6373311 -0.6006817 1.1753027 -1.3590472 2.1085012 -1.0147549 med_high
-0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.8306578 0.9387626 -0.0037245 -0.6373311 -0.6006817 1.1753027 0.0229582 0.7977717 -1.0256279 med_high
-0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.2684739 1.0062609 -0.0167367 -0.6373311 -0.6006817 1.1753027 -1.1869674 1.0764418 -0.9821359 med_high
-0.4872402 -0.7545936 -0.2723291 -0.4806367 -0.5004637 -0.0133185 -0.2064589 -0.5224844 -0.7668172 0.3438730 0.4406159 -0.4163335 -0.3949946 low
-0.4872402 -0.7545936 -0.2723291 -0.4806367 -0.6314027 -0.2548913 -0.1981007 -0.5224844 -0.7668172 0.3438730 0.2287748 -0.1740726 -0.2753917 med_low
-0.4872402 -0.7545936 -0.2723291 -0.4806367 -0.6185935 -0.9618471 0.0660857 -0.5224844 -0.7668172 0.3438730 0.4406159 -0.5437656 -0.1666618 low
-0.4872402 -0.7545936 -0.2723291 -0.4806367 -0.4534965 -1.3632843 0.0248170 -0.5224844 -0.7668172 0.3438730 0.4026072 -0.3533177 0.2356387 med_low
2.7285450 -1.1933466 -0.2723291 -1.0933517 0.4417279 -1.6616978 0.7627153 -0.7521778 -0.9270193 -0.0718418 0.4267049 -1.1669222 0.8988910 low
2.7285450 -1.1933466 -0.2723291 -1.0933517 1.0523023 -1.8748503 0.7627153 -0.7521778 -0.9270193 -0.0718418 0.4265954 -1.4946046 1.3446835 low
-0.4872402 -0.6161168 -0.2723291 -0.9207559 0.6907967 -2.3331282 0.9145880 -0.7521778 -1.0397541 -0.2566040 0.3147600 -1.0941039 0.4422255 med_low
-0.4872402 -0.6161168 -0.2723291 -0.9207559 -0.1645767 -2.2016841 0.9145880 -0.7521778 -1.0397541 -0.2566040 0.2924148 -0.9582698 0.3008766 med_low
-0.4872402 -0.6161168 -0.2723291 -0.9207559 -0.1048002 -2.2052367 0.9145880 -0.7521778 -1.0397541 -0.2566040 0.4138893 -0.7300124 0.2356387 med_low
-0.4872402 -0.6161168 -0.2723291 -0.9207559 -0.3069017 -1.0151352 0.9145880 -0.7521778 -1.0397541 -0.2566040 0.3583550 -0.4345381 -0.1449159 med_low
-0.4872402 -0.6161168 -0.2723291 -0.9207559 -0.8576995 -1.2353928 0.6199131 -0.7521778 -1.0397541 -0.2566040 0.4406159 -0.3421149 -0.3515026 med_low
-0.4872402 -0.6161168 -0.2723291 -0.9207559 -0.7096815 -1.2531555 0.6199131 -0.7521778 -1.0397541 -0.2566040 0.4406159 0.2096238 -0.2753917 med_low
-0.4872402 -0.6161168 -0.2723291 -0.9207559 -0.3624084 0.6012712 0.8996287 -0.7521778 -1.0397541 -0.2566040 0.3950493 0.8607875 -0.6450733 med_low
-0.4872402 -0.6161168 -0.2723291 -0.9207559 -1.2604793 0.9494202 0.9853955 -0.7521778 -1.0397541 -0.2566040 0.4406159 2.5426103 -0.8842790 med_low
-0.4872402 -0.6161168 -0.2723291 -0.9207559 -0.9715595 -0.2335761 1.0887811 -0.7521778 -1.0397541 -0.2566040 0.4406159 0.4966960 -0.3406296 med_low
0.4131797 -0.8012385 -0.2723291 -0.9984241 -0.4577662 -0.8126404 1.4340328 -0.6373311 -0.9804200 -0.7646999 0.4259382 0.1115992 -0.3080107 med_low
0.4131797 -0.8012385 -0.2723291 -0.9984241 -0.2414322 -0.1980507 1.4340328 -0.6373311 -0.9804200 -0.7646999 0.4085221 -0.4513423 -0.2210268 low
0.4131797 -0.8012385 -0.2723291 -0.9984241 0.3221749 -1.6865656 1.4340328 -0.6373311 -0.9804200 -0.7646999 0.4406159 -1.0324884 0.2682577 low
0.4131797 -0.8012385 -0.2723291 -0.9984241 -0.4079525 -1.6759080 1.4340328 -0.6373311 -0.9804200 -0.7646999 0.4406159 -0.5913775 0.0942899 low
2.7285450 -1.0402932 -0.2723291 -1.2486880 -0.5645100 -0.7451421 1.6738568 -0.7521778 0.3605309 1.2214933 0.4406159 0.3006467 -0.3949946 low
3.3717021 -1.4455202 -0.2723291 -1.3090965 1.3725336 -1.6581453 2.3277455 -0.5224844 -1.0812880 -0.2566040 0.4299910 -1.0983050 1.3990484 low
3.1573164 -1.5154874 -0.2723291 -1.2486880 0.1399989 -1.1678945 2.5609210 -0.8670245 -0.5650812 -0.5337472 0.4406159 -0.9638712 0.2356387 low
3.8004735 -1.4309437 -0.2723291 -1.2400582 0.7562662 -0.9973725 2.1511780 -0.5224844 -0.9032856 -1.5499390 0.3968018 -1.2187352 0.9858749 low
0.5846882 -0.8755787 -0.2723291 -0.8776070 -0.1987347 -1.3988097 1.9089794 -0.1779443 -0.7371501 0.5748257 0.3724850 -0.8112328 0.0834169 med_low
0.5846882 -0.8755787 -0.2723291 -0.8776070 -0.5090032 -0.7593522 1.4897384 -0.1779443 -0.7371501 0.5748257 0.4406159 -0.4807497 -0.3188837 med_low
0.5846882 -0.8755787 -0.2723291 -0.8776070 -0.7737278 -0.0843694 1.6290739 -0.1779443 -0.7371501 0.5748257 0.4210091 0.0695886 -0.4167406 med_low
0.5846882 -0.8755787 -0.2723291 -0.8776070 -0.4534965 0.8819220 1.4358374 -0.1779443 -0.7371501 0.5748257 0.2344707 0.2502341 -0.7103112 med_low
0.5846882 -0.8755787 -0.2723291 -0.8776070 0.2438961 -0.0275287 1.6291213 -0.1779443 -0.7371501 0.5748257 0.4406159 -0.8294374 -0.0361860 med_low
0.5846882 -0.8755787 -0.2723291 -0.8776070 0.6794107 -0.8943488 1.9878602 -0.1779443 -0.7371501 0.5748257 0.4261573 -0.4415399 0.2682577 med_low
0.2631097 -1.4221978 -0.2723291 -1.1960462 1.1661623 -0.3223896 2.5776850 -0.7521778 -1.1406221 0.0667298 0.4005260 -0.6445909 1.1380967 low
2.9429307 -1.1321252 -0.2723291 -1.3522454 0.0076366 -1.8037995 1.3375333 -0.6373311 -0.4226793 -1.0880337 0.4406159 -1.1179099 0.1051628 low
2.9429307 -1.1321252 -0.2723291 -1.3522454 -0.7082582 -1.3313114 1.3375333 -0.6373311 -0.4226793 -1.0880337 0.4406159 -0.3379138 -0.3406296 low
0.0487240 -0.7385595 -0.2723291 -1.2573178 -0.5787425 -1.6759080 1.2836322 -0.6373311 -0.3752120 0.2053014 0.4330580 -0.6375891 -0.0579320 low
0.0487240 -0.7385595 -0.2723291 -1.2573178 -0.9829455 -1.1288166 1.2836322 -0.6373311 -0.3752120 0.2053014 0.4406159 0.0611865 -0.5580894 med_low
0.0487240 -0.7385595 -0.2723291 -1.2573178 -0.5687797 -1.2638131 1.2836322 -0.6373311 -0.3752120 0.2053014 0.4406159 -0.5409648 -0.1775348 med_low
-0.4872402 -0.0476329 -0.2723291 -1.2227986 0.1883894 -2.2016841 0.7086718 -0.6373311 -0.6125485 0.3438730 0.2963581 -0.8308377 0.1812738 med_low
-0.4872402 -0.0476329 -0.2723291 -1.2227986 -0.4606127 -1.8144571 0.7086718 -0.6373311 -0.6125485 0.3438730 0.2219837 -0.3883265 -0.0905509 med_low
-0.4872402 -0.0476329 -0.2723291 -1.2227986 -0.3125947 -2.1590536 0.7086718 -0.6373311 -0.6125485 0.3438730 0.3750043 -0.9988800 0.0290519 med_low
-0.4872402 -0.0476329 -0.2723291 -1.2227986 -0.0564097 -2.2158943 0.7086718 -0.6373311 -0.6125485 0.3438730 0.2245030 -0.7160089 0.0942899 med_low
-0.4872402 0.2468126 -0.2723291 -1.0156836 -0.0165586 -2.2229994 0.2167712 -0.5224844 -0.0607412 0.1129203 0.4189279 -0.8224356 0.1704008 low
-0.4872402 0.2468126 -0.2723291 -1.0156836 0.0019436 -0.8375082 0.3360184 -0.5224844 -0.0607412 0.1129203 0.2908813 -0.5199596 -0.1231699 med_low
-0.4872402 0.2468126 -0.2723291 -1.0156836 -0.0080191 0.2104916 0.1221238 -0.5224844 -0.0607412 0.1129203 0.1860561 -0.0956529 -0.2753917 med_low
-0.4872402 0.2468126 -0.2723291 -1.0156836 -0.2058509 -0.8090878 0.1403124 -0.5224844 -0.0607412 0.1129203 0.3317379 -0.3337127 -0.1884078 med_low
-0.4872402 0.2468126 -0.2723291 -1.0156836 -0.0749119 -0.5284370 0.5789293 -0.5224844 -0.0607412 0.1129203 0.3256039 -0.0438399 -0.1449159 low
-0.4872402 0.2468126 -0.2723291 -1.0156836 -0.5844355 -1.1359217 0.3360184 -0.5224844 -0.0607412 0.1129203 0.4314149 -0.4975539 -0.2427728 med_low
0.5846882 -0.9149352 -0.2723291 -1.1106113 0.6295970 -1.2460504 0.7625253 -0.6373311 -0.7549503 0.2514920 0.4406159 -1.0310881 0.5944473 low
0.5846882 -0.9149352 -0.2723291 -1.1106113 0.4758859 0.0648374 0.7625253 -0.6373311 -0.7549503 0.2514920 0.4267049 -0.7608201 0.1486548 low
0.5846882 -0.9149352 -0.2723291 -1.1106113 0.0247156 -1.2922335 0.7625253 -0.6373311 -0.7549503 0.2514920 0.4406159 -0.8308377 0.2465117 low
0.5846882 -0.9149352 -0.2723291 -1.1106113 -0.1674232 -0.7771150 0.7625253 -0.6373311 -0.7549503 0.2514920 0.3720469 -0.7202099 0.0399249 low
-0.4872402 -0.9688683 -0.2723291 -0.9121262 0.1485384 -0.7309319 0.4674705 -0.7521778 -0.9566863 0.0205393 0.4406159 -0.4247356 0.1486548 low
-0.4872402 -0.9688683 -0.2723291 -0.9121262 0.4915417 -0.4431760 0.3051974 -0.7521778 -0.9566863 0.0205393 0.3902297 -0.8574444 0.4422255 low
-0.4872402 -0.9688683 -0.2723291 -0.9121262 -0.3837572 -0.8339556 0.3002110 -0.7521778 -0.9566863 0.0205393 0.4306482 0.0289784 -0.0035670 low
-0.4872402 -0.9688683 -0.2723291 -0.9121262 -0.2328927 -0.4183083 -0.0225305 -0.7521778 -0.9566863 0.0205393 0.4214472 -0.5899772 -0.0361860 low
-0.4872402 -1.1262946 -0.2723291 -0.5669346 1.0281070 0.6296915 -0.1773001 -0.8670245 -0.8202179 -0.3027945 0.4406159 -1.0016807 0.1160358 low
-0.4872402 -1.1262946 -0.2723291 -0.5669346 1.1305810 -0.1944981 -0.1807194 -0.8670245 -0.8202179 -0.3027945 0.4314149 -0.9736736 0.6705582 low
-0.4872402 -1.1262946 -0.2723291 -0.5669346 0.1883894 -0.0879219 -0.3337319 -0.8670245 -0.8202179 -0.3027945 0.3889153 -0.5381641 0.0073060 low
-0.4872402 -1.1262946 -0.2723291 -0.5669346 0.1713104 0.1891763 -0.3338269 -0.8670245 -0.8202179 -0.3027945 0.4039216 -0.6235856 -0.0579320 low
0.7133196 0.5689534 -0.2723291 -0.7826793 0.2239706 -0.5319896 -0.0613298 -0.6373311 -0.8202179 -0.1180323 0.4199137 -0.6291870 0.0399249 low
0.7133196 0.5689534 -0.2723291 -0.7826793 -0.1048002 -1.4094674 -0.0613298 -0.6373311 -0.8202179 -0.1180323 0.4343724 -0.9022557 0.2682577 low
0.7133196 0.5689534 -0.2723291 -0.7826793 -0.0507166 0.3099628 -0.0855021 -0.6373311 -0.8202179 -0.1180323 0.4406159 -0.2889015 -0.2101538 low
-0.4872402 -1.2020925 -0.2723291 -0.9466453 0.4844254 -0.3827828 -0.1423950 -0.8670245 -0.7846174 -0.2104134 0.0143049 -0.8406402 0.6379392 med_low
-0.4872402 -1.2020925 -0.2723291 -0.9466453 -0.1731162 0.0364171 -0.1423950 -0.8670245 -0.7846174 -0.2104134 0.3850816 -0.1838751 -0.1231699 med_low
-0.4872402 -1.2020925 -0.2723291 -0.9466453 2.5395987 0.2637797 -0.1423950 -0.8670245 -0.7846174 -0.2104134 0.4406159 -1.1823261 1.7578570 med_low
-0.4872402 -1.2020925 -0.2723291 -0.9466453 2.1852094 -1.1252640 -0.1423950 -0.8670245 -0.7846174 -0.2104134 0.4037025 -1.2719486 2.3123794 low
-0.4872402 -1.2020925 -0.2723291 -0.9466453 1.6102164 -0.2158134 -0.1423950 -0.8670245 -0.7846174 -0.2104134 0.4406159 -0.9050564 1.1598427 low
-0.4872402 -0.3756044 -0.2723291 -0.2994111 0.6295970 0.4023288 -0.4830877 -0.5224844 -0.1438090 1.1291122 0.4171754 -0.4527427 0.5400824 med_low
-0.4872402 -0.3756044 -0.2723291 -0.2994111 0.7064525 0.0968103 -0.4459031 -0.5224844 -0.1438090 1.1291122 0.4261573 -0.6978043 0.4313525 med_low
-0.4872402 -0.3756044 -0.2723291 -0.2994111 0.1713104 0.5977186 -0.5130539 -0.5224844 -0.1438090 1.1291122 -3.1313265 -0.2833001 -0.4276135 med_low
-0.4872402 -0.3756044 -0.2723291 -0.2994111 -0.2101207 0.6687695 -0.5130539 -0.5224844 -0.1438090 1.1291122 0.4139988 0.1101988 -0.3515026 med_low
-0.4872402 -0.3756044 -0.2723291 -0.2994111 -0.1674232 0.7611355 -0.6525317 -0.5224844 -0.1438090 1.1291122 0.3945016 -0.0452402 -0.2645187 med_low
-0.4872402 -0.3756044 -0.2723291 -0.2994111 -0.6171702 0.9991558 -0.8016976 -0.5224844 -0.1438090 1.1291122 0.4093984 0.5345055 -0.3297567 med_low
-0.4872402 -0.3756044 -0.2723291 -0.2994111 -0.6385190 0.8286338 -0.7522606 -0.5224844 -0.1438090 1.1291122 0.4271431 0.8411826 -0.3297567 med_low
-0.4872402 -0.3756044 -0.2723291 -0.2994111 -0.2243532 0.5906135 -0.7943366 -0.5224844 -0.1438090 1.1291122 0.3397340 0.2012217 -0.2318998 med_low
-0.4872402 -0.3756044 -0.2723291 -0.2994111 0.2695146 1.0133660 -0.6468804 -0.5224844 -0.1438090 1.1291122 0.4224331 -0.0536423 -0.2971377 med_low
-0.4872402 -0.3756044 -0.2723291 -0.2994111 -0.0791817 0.8037660 -0.5935968 -0.5224844 -0.1438090 1.1291122 0.3785094 0.4056731 -0.3406296 med_high
-0.4872402 -0.3756044 -0.2723291 -0.2994111 -0.1275722 -0.5035693 -0.4830877 -0.5224844 -0.1438090 1.1291122 0.4032644 0.0485834 -0.0905509 med_low
-0.4872402 -0.1642450 -0.2723291 -0.0664067 0.6125180 0.4627220 -0.5307201 -0.4076377 0.1409947 -0.3027945 0.4262668 -0.3491166 0.0290519 med_low
-0.4872402 -0.1642450 -0.2723291 -0.0664067 -0.5289287 0.8641592 -0.6846349 -0.4076377 0.1409947 -0.3027945 0.4192565 0.4980964 -0.4058676 med_low
-0.4872402 -0.1642450 -0.2723291 -0.0664067 -0.2741669 0.9529728 -0.5922195 -0.4076377 0.1409947 -0.3027945 0.4406159 0.6213273 -0.4167406 med_low
-0.4872402 -0.1642450 -0.2723291 -0.0664067 -0.0436004 0.5550881 -0.7306527 -0.4076377 0.1409947 -0.3027945 0.3512352 -0.3085064 -0.4384865 med_low
-0.4872402 -0.1642450 -0.2723291 -0.0664067 -0.5075800 0.6971898 -0.6325385 -0.4076377 0.1409947 -0.3027945 -0.1288575 0.4350805 -0.4602325 med_low
-0.4872402 -0.1642450 -0.2723291 -0.0664067 -0.1546139 0.1394408 -0.5057404 -0.4076377 0.1409947 -0.3027945 0.4011832 -0.0858504 -0.1449159 med_low
-0.4872402 -0.1642450 -0.2723291 -0.0664067 -0.3752177 0.4982475 -0.4975246 -0.4076377 0.1409947 -0.3027945 0.4144370 -0.3295117 -0.3623756 med_low
-0.4872402 -0.1642450 -0.2723291 -0.0664067 -0.5872820 0.1607560 -0.6256999 -0.4076377 0.1409947 -0.3027945 -0.1976456 0.3804668 -0.2318998 med_low
-0.4872402 -0.1642450 -0.2723291 -0.0664067 -0.7879603 -0.1198948 -0.4919208 -0.4076377 0.1409947 -0.3027945 0.3814669 0.1340048 -0.3515026 med_low
-0.4872402 2.1155211 -0.2723291 0.2270061 -0.5901285 0.0399696 -0.7300828 -0.8670245 -1.3067576 0.2976825 0.3557261 0.2404316 -0.0579320 low
-0.4872402 2.1155211 -0.2723291 0.2270061 -0.3994130 0.5515356 -0.7587192 -0.8670245 -1.3067576 0.2976825 0.2299797 0.2264281 -0.2427728 low
-0.4872402 2.1155211 -0.2723291 0.2270061 -0.4606127 0.8641592 -0.8111956 -0.8670245 -1.3067576 0.2976825 0.2345802 0.7389569 -0.2210268 med_low
-0.4872402 2.1155211 -0.2723291 0.2270061 -0.6100540 1.0098134 -0.8788687 -0.8670245 -1.3067576 0.2976825 0.1493618 1.7864202 -0.5689624 med_low
-0.4872402 2.1155211 -0.2723291 0.2270061 -0.5773192 0.9671829 -0.8494724 -0.8670245 -1.3067576 0.2976825 0.2487102 0.6899446 -0.4058676 med_low
-0.4872402 2.1155211 -0.2723291 0.2270061 -0.4250315 0.7042949 -0.8558361 -0.8670245 -1.3067576 0.2976825 0.3104881 0.3020471 -0.1231699 med_low
-0.4872402 2.1155211 -0.2723291 0.2270061 -0.9559038 0.9600779 -0.9677698 -0.8670245 -1.3067576 0.2976825 0.0286541 2.0454854 -0.7429302 med_high
-0.4872402 1.5674443 -0.2723291 0.5980871 -0.8420438 0.9742880 -0.9530004 -0.6373311 0.1706618 1.2676838 0.3881485 0.6353309 -0.6885653 med_high
-0.4872402 1.5674443 -0.2723291 0.5980871 0.2083149 1.0737592 -0.9415079 -0.6373311 0.1706618 1.2676838 0.4406159 0.3832675 -0.4928515 med_high
-0.4872402 1.5674443 -0.2723291 0.5980871 -0.9217458 0.9281050 -0.8620098 -0.6373311 0.1706618 1.2676838 0.4406159 0.7963713 -0.8951520 med_high
-0.4872402 1.5674443 -0.2723291 0.5980871 0.2467426 1.0773117 -0.7961887 -0.6373311 0.1706618 1.2676838 0.4202424 -0.0074307 -0.3623756 med_high
-0.4872402 1.5674443 -0.2723291 0.5980871 0.0588736 1.0346812 -0.7237666 -0.6373311 0.1706618 1.2676838 0.4406159 -0.0550427 -0.3188837 med_high
-0.4872402 1.5674443 -0.2723291 0.5980871 0.1243431 1.0417863 -0.6969823 -0.6373311 0.1706618 1.2676838 0.3185937 -0.2146828 0.0507979 med_high
-0.4872402 1.5674443 -0.2723291 0.5980871 -0.6584445 0.9529728 -0.6293092 -0.6373311 0.1706618 1.2676838 0.3506875 0.3328548 -0.4493595 med_high
-0.4872402 1.5674443 -0.2723291 0.5980871 -0.7509558 1.0595490 -0.6881492 -0.6373311 0.1706618 1.2676838 -1.0286891 0.6521351 -0.7538032 med_high
-0.4872402 1.5674443 -0.2723291 0.5980871 0.0716829 1.0524439 -0.7998930 -0.6373311 0.1706618 1.2676838 0.4161895 0.6031228 -0.4819785 med_high
-0.4872402 1.5674443 -0.2723291 0.5980871 -0.4876545 0.8854745 -0.8681835 -0.6373311 0.1706618 1.2676838 0.2363328 0.5947207 -0.5580894 med_high
-0.4872402 1.5674443 -0.2723291 0.5980871 0.2410496 1.0595490 -0.9237941 -0.6373311 0.1706618 1.2676838 0.4097270 0.2712393 -0.5907084 med_high
-0.4872402 1.5674443 -0.2723291 0.5980871 -0.6086307 1.0524439 -1.0098459 -0.6373311 0.1706618 1.2676838 0.3873818 1.2136763 -1.0038819 med_low
-0.4872402 1.5674443 -0.2723291 0.5980871 -0.1901952 1.0417863 -1.0097984 -0.6373311 0.1706618 1.2676838 0.4406159 0.8131756 -0.5145975 med_high
-0.4872402 1.5674443 -0.2723291 0.5980871 -0.1574604 0.8890270 -1.0367727 -0.6373311 0.1706618 1.2676838 0.3440059 1.6113762 -0.9277710 med_high
-0.4872402 1.5674443 -0.2723291 0.5980871 -1.8013144 1.1163897 -1.1186928 -0.6373311 0.1706618 1.2676838 0.4406159 3.0467371 -0.8842790 med_high
-0.4872402 1.2307270 3.6647712 2.7296452 -1.2547863 1.1163897 -1.1746359 -0.5224844 -0.0310742 -1.7347012 0.4406159 1.9838699 -0.9930089 med_high
-0.4872402 1.2307270 -0.2723291 2.7296452 -1.1622751 1.1163897 -1.1318000 -0.5224844 -0.0310742 -1.7347012 0.4406159 1.9278558 -0.7538032 high
-0.4872402 1.2307270 -0.2723291 2.7296452 -1.9664114 1.0382338 -1.1630958 -0.5224844 -0.0310742 -1.7347012 0.4406159 2.3297568 -1.1669767 med_high
-0.4872402 1.2307270 -0.2723291 2.7296452 -0.2200834 1.1163897 -1.1283332 -0.5224844 -0.0310742 -1.7347012 -2.0128627 2.1211044 -0.9495170 med_high
-0.4872402 1.2307270 -0.2723291 2.7296452 -0.9345550 1.1163897 -1.0820306 -0.5224844 -0.0310742 -1.7347012 -2.0527336 0.5597119 -0.7538032 med_high
-0.4872402 1.2307270 -0.2723291 2.7296452 -1.9336767 0.9636304 -1.1085299 -0.5224844 -0.0310742 -1.7347012 0.3837671 2.3633653 -0.8625331 med_high
-0.4872402 1.2307270 -0.2723291 2.7296452 -1.5636316 0.8961321 -1.0758569 -0.5224844 -0.0310742 -1.7347012 0.0034610 2.1939227 -0.5145975 med_high
-0.4872402 1.2307270 -0.2723291 2.7296452 -0.9786758 0.9352101 -1.0777090 -0.5224844 -0.0310742 -1.7347012 -0.0528401 1.2318808 -0.7755492 med_high
-0.4872402 1.2307270 -0.2723291 2.7296452 -0.2314694 1.0204711 -1.0338758 -0.5224844 -0.0310742 -1.7347012 0.1766361 0.2026221 -0.1122969 med_high
-0.4872402 1.2307270 -0.2723291 2.7296452 -1.2533631 1.1163897 -1.0464131 -0.5224844 -0.0310742 -1.7347012 -0.1651137 0.0877932 -0.3188837 med_high
-0.4872402 1.2307270 3.6647712 2.7296452 -1.8112772 0.6900847 -1.0375800 -0.5224844 -0.0310742 -1.7347012 -0.1467118 -0.0746476 -0.7864221 med_high
-0.4872402 1.2307270 -0.2723291 2.7296452 -0.8192718 1.0631016 -1.0314063 -0.5224844 -0.0310742 -1.7347012 -1.0375614 0.4392816 -0.3406296 med_high
-0.4872402 1.2307270 3.6647712 2.7296452 -0.2215067 0.9742880 -0.9714740 -0.5224844 -0.0310742 -1.7347012 -0.3905371 0.3454580 -0.6015814 med_high
-0.4872402 1.2307270 3.6647712 2.7296452 -0.1887719 0.4982475 -0.9733261 -0.5224844 -0.0310742 -1.7347012 -2.9428165 0.3314545 -0.7538032 med_high
-0.4872402 1.2307270 -0.2723291 2.7296452 -1.4412321 0.9032372 -0.9776477 -0.5224844 -0.0310742 -1.7347012 -2.9360253 0.4882939 -1.0256279 med_high
-0.4872402 1.2307270 -0.2723291 0.4341211 0.9370190 1.0240236 -0.9107344 -0.5224844 -0.0310742 -1.7347012 0.0740016 -1.1291127 2.0405547 med_high
-0.4872402 1.2307270 -0.2723291 0.4341211 -0.3111714 1.1163897 -0.9677223 -0.5224844 -0.0310742 -1.7347012 -0.0304949 -0.8714479 0.1921468 med_high
-0.4872402 1.2307270 -0.2723291 2.7296452 0.3207517 1.1163897 -0.9636382 -0.5224844 -0.0310742 -1.7347012 0.0836407 -0.7370141 0.0834169 med_high
-0.4872402 1.2307270 3.6647712 0.4341211 -0.0492934 0.8535016 -0.9482040 -0.5224844 -0.0310742 -1.7347012 -0.1944691 -1.0016807 0.4857174 med_high
-0.4872402 1.2307270 -0.2723291 0.4341211 1.7141136 0.7895559 -0.8662839 -0.5224844 -0.0310742 -1.7347012 0.1944903 -1.5296134 2.9865046 med_high
-0.4872402 1.2307270 3.6647712 0.4341211 2.1595909 1.0524439 -0.8331359 -0.5224844 -0.0310742 -1.7347012 0.3607647 -1.5030067 2.9865046 med_high
-0.4872402 1.2307270 3.6647712 0.4341211 2.9751133 0.8996847 -0.7755306 -0.5224844 -0.0310742 -1.7347012 0.3480587 -1.3069574 2.9865046 med_high
-0.4872402 1.2307270 -0.2723291 0.4341211 -0.6129005 0.8250813 -0.6520568 -0.5224844 -0.0310742 -1.7347012 0.4210091 -0.1418645 0.0181789 med_high
-0.4872402 1.2307270 -0.2723291 0.4341211 -0.2613577 0.8677118 -0.7178779 -0.5224844 -0.0310742 -1.7347012 -1.2762386 -0.3981289 0.2682577 med_high
-0.4872402 1.2307270 -0.2723291 0.4341211 2.3403437 0.9813931 -0.8306664 -0.5224844 -0.0310742 -1.7347012 0.1382988 -1.2537440 2.9865046 med_high
-0.4872402 1.2307270 -0.2723291 0.4341211 -0.5801657 0.3774611 -0.6502047 -0.5224844 -0.0310742 -1.7347012 -1.4137053 -0.0718469 0.1377818 med_high
-0.4872402 1.2307270 -0.2723291 0.4341211 0.0489109 0.9778406 -0.8049744 -0.5224844 -0.0310742 -1.7347012 -0.6526548 -0.2174835 0.1377818 med_high
-0.4872402 1.2307270 -0.2723291 0.4341211 0.1670406 0.9458677 -0.7278033 -0.5224844 -0.0310742 -1.7347012 -0.2917364 -0.1866758 -0.0253130 med_high
-0.4872402 1.2307270 -0.2723291 0.4341211 -0.5830122 0.9245525 -0.6502047 -0.5224844 -0.0310742 -1.7347012 -0.7052317 0.2488337 -0.5580894 med_high
-0.4872402 1.2307270 -0.2723291 0.4341211 -0.5758960 1.0204711 -0.6678710 -0.5224844 -0.0310742 -1.7347012 -0.0935872 -0.0872508 -0.3732486 med_high
-0.4872402 -1.0330050 -0.2723291 -0.3857090 -1.0142570 0.7078474 -0.5693769 -0.5224844 -0.6659492 -0.8570810 0.4406159 0.2852429 0.0616709 med_low
-0.4872402 -1.0330050 -0.2723291 -0.3857090 0.1869661 0.5515356 -0.5455370 -0.5224844 -0.6659492 -0.8570810 0.4252810 -0.5059560 0.1160358 med_low
-0.4872402 -1.0330050 -0.2723291 -0.3857090 -0.6057842 0.0044442 -0.5191326 -0.5224844 -0.6659492 -0.8570810 0.4004165 -0.4219349 0.0073060 med_low
-0.4872402 -1.0330050 -0.2723291 -0.3857090 0.3719887 -1.2602606 -0.3147360 -0.5224844 -0.6659492 -0.8570810 0.3755520 -1.0254866 0.7466691 low
-0.4872402 -1.0330050 -0.2723291 -0.3857090 -0.3766409 -0.7593522 -0.1140436 -0.5224844 -0.6659492 -0.8570810 0.4004165 -0.3561184 0.0725439 low
-0.4872402 -1.0330050 -0.2723291 -0.3857090 0.0432179 0.1714136 -0.2267846 -0.5224844 -0.6659492 -0.8570810 0.4263763 -0.8910529 0.2247657 low
-0.4872402 -1.0330050 -0.2723291 -0.3857090 0.8188892 0.2069391 -0.4177891 -0.5224844 -0.6659492 -0.8570810 0.3789476 -0.8028307 0.8010341 low
-0.4872402 -1.2647715 -0.2723291 -0.5755644 0.9896793 -0.3614676 -0.4587729 -0.7521778 -1.2770905 -0.3027945 0.4406159 -1.0660969 1.5947622 low
-0.4872402 -1.2647715 -0.2723291 -0.5755644 2.1069307 0.5231153 -0.5005640 -0.7521778 -1.2770905 -0.3027945 0.4259382 -0.7132081 1.8774599 low
-0.4872402 -1.2647715 -0.2723291 -0.5755644 -0.2001579 -0.2264710 -0.5685221 -0.7521778 -1.2770905 -0.3027945 0.4406159 -0.4485416 1.4860323 low
-0.4872402 -1.2647715 -0.2723291 -0.5755644 1.2387480 0.8392915 -0.5197499 -0.7521778 -1.2770905 -0.3027945 0.4101651 -1.0969046 1.6708731 med_low
-0.4872402 -1.2647715 -0.2723291 -0.5755644 0.3961839 0.9600779 -0.4502247 -0.7521778 -1.2770905 -0.3027945 0.4406159 -0.9764743 1.0837317 med_low
-0.4872402 -1.2647715 -0.2723291 -0.5755644 -0.9687130 0.7540305 -0.3833114 -0.7521778 -1.2770905 -0.3027945 0.3759901 0.1858179 0.4204795 med_low
-0.4872402 -1.2647715 -0.2723291 -0.5755644 -0.1873487 0.0079967 -0.2447358 -0.7521778 -1.2770905 -0.3027945 0.3333809 0.0695886 0.7684151 low
-0.4872402 -1.2647715 -0.2723291 -0.5755644 2.2008652 -0.5319896 -0.2829652 -0.7521778 -1.2770905 -0.3027945 0.3938444 -1.1487176 2.9865046 low
1.4422310 -1.1219217 -0.2723291 -1.0156836 0.7078757 -0.9760573 -0.0030596 -0.5224844 -0.0607412 -1.5037485 0.4074267 -0.8364391 1.0293668 low
1.4422310 -1.1219217 -0.2723291 -1.0156836 0.3862212 -1.4023623 0.3664594 -0.5224844 -0.0607412 -1.5037485 0.2866094 -1.1333138 0.7901611 med_low
1.4422310 -1.1219217 -0.2723291 -1.0156836 1.2814456 -1.0542132 0.3664594 -0.5224844 -0.0607412 -1.5037485 0.4406159 -1.0170845 1.3446835 med_low
1.4422310 -1.1219217 -0.2723291 -1.0156836 0.9484050 -1.6723554 1.2749890 -0.5224844 -0.0607412 -1.5037485 0.2300893 -1.0576947 1.5730162 med_low
1.4422310 -1.1219217 -0.2723291 -1.0156836 0.6466760 -1.3419691 1.2749890 -0.5224844 -0.0607412 -1.5037485 0.3618601 -1.1151092 0.8662720 low
1.4422310 -1.1219217 -0.2723291 -1.0156836 1.2714828 -1.5018334 1.2749890 -0.5224844 -0.0607412 -1.5037485 0.3704038 -1.3699732 1.5077783 med_low
2.0853880 -1.1962619 -0.2723291 -1.3263561 0.7334942 -2.0844502 1.1514203 -0.9818712 -0.8498849 -1.3189864 0.4019500 -1.0674972 0.9315099 low
2.0853880 -1.1962619 -0.2723291 -1.3263561 0.4545372 -1.7682741 1.1514203 -0.9818712 -0.8498849 -1.3189864 0.2193548 -1.1585201 0.7140502 low
2.9429307 -1.5563017 -0.2723291 -1.1451305 2.2634882 -1.2993386 0.8801579 -0.6373311 -0.9092190 -1.8732728 0.4113700 -1.3559697 2.9865046 low
2.9429307 -1.4017907 -0.2723291 -1.3004667 1.4266171 -1.2247352 1.6687754 -0.8670245 -0.4701466 -2.7047025 0.4406159 -1.2005307 1.1707156 low
2.9429307 -1.4017907 -0.2723291 -1.3004667 1.1704320 -1.1359217 1.6687754 -0.8670245 -0.4701466 -2.7047025 -0.0258945 -0.5661712 0.8445260 low
2.9429307 -1.4017907 -0.2723291 -1.3004667 1.4081148 -1.0755284 1.6687754 -0.8670245 -0.4701466 -2.7047025 0.3891344 -0.8448412 1.3120645 low
3.5860878 -1.4090789 -0.2723291 -1.3090965 0.9825630 -1.8926130 1.8323307 -0.7521778 -0.0370076 -0.6723188 0.4406159 -1.1333138 1.3446835 low
3.5860878 -1.4090789 -0.2723291 -1.3090965 1.2102830 -1.9423486 1.8323307 -0.7521778 -0.0370076 -0.6723188 0.3026016 -1.1487176 1.1272237 low
3.0501236 -1.3274505 -0.2723291 -1.2055390 -0.1745394 -1.0719759 1.1753552 -0.8670245 -0.3574118 -1.7347012 0.4063314 -0.7314127 0.1704008 low
3.0501236 -1.3274505 -0.2723291 -1.2055390 1.8863269 -1.8784028 1.1753552 -0.8670245 -0.3574118 -1.7347012 0.4239665 -1.3363648 2.1492845 low
3.5860878 -1.2327031 -0.2723291 -1.1960462 2.2321767 -1.2567081 0.6282713 -0.6373311 -1.0931548 -1.7347012 0.3954874 -1.2383402 2.8234098 low
3.5860878 -1.2327031 -0.2723291 -1.1960462 2.4897850 -1.3028911 0.6282713 -0.6373311 -1.0931548 -1.7347012 0.3710611 -1.3685729 2.9865046 low
-0.4872402 -0.0797012 -0.2723291 -0.5669346 -0.5602402 -1.6439351 0.0714046 -0.6373311 -0.7786840 0.0667298 0.4406159 -0.2496916 0.0073060 med_low
-0.4872402 -0.0797012 -0.2723291 -0.5669346 0.0588736 -0.5710675 0.2658758 -0.6373311 -0.7786840 0.0667298 0.4183803 -0.2356881 0.2030197 med_low
-0.4872402 -0.0797012 -0.2723291 -0.5669346 -0.7139512 0.1465458 0.2658758 -0.6373311 -0.7786840 0.0667298 0.3587931 0.7571615 -0.0035670 med_low
*(Output truncated: the full printout of the scaled data frame is omitted here. Each row shows 13 standardized numeric variables followed by the categorical variable with the levels low, med_low, med_high and high.)*
-0.4872402 1.0149946 -0.2723291 1.3661384 0.1300361 0.7042949 -0.5831490 1.6596029 1.5294129 0.8057784 0.3807001 0.2796414 -0.5254704 high
-0.4872402 1.0149946 -0.2723291 1.3661384 0.0460644 0.5124576 -0.5036983 1.6596029 1.5294129 0.8057784 0.4406159 0.1872182 -0.3297567 high
-0.4872402 1.0149946 -0.2723291 1.3661384 0.3250214 0.7575830 -0.4717851 1.6596029 1.5294129 0.8057784 0.4068791 -0.3309120 -0.2536457 high
-0.4872402 1.0149946 -0.2723291 0.8656106 -0.1076467 -0.1127897 -0.3949464 1.6596029 1.5294129 0.8057784 0.4406159 0.0793911 -0.1231699 high
-0.4872402 1.0149946 -0.2723291 0.8656106 -0.7481093 -0.7238268 -0.3459843 1.6596029 1.5294129 0.8057784 -0.2439790 0.2068231 -0.2862647 med_high
-0.4872402 1.0149946 -0.2723291 0.8656106 -0.4734220 0.5728508 -0.4385897 1.6596029 1.5294129 0.8057784 -3.6657487 0.6297295 -0.3841216 high
-0.4872402 1.0149946 -0.2723291 0.2528955 -0.4008362 0.9209999 -0.5958763 1.6596029 1.5294129 0.8057784 -0.2780445 1.2136763 -0.3732486 high
-0.4872402 1.0149946 -0.2723291 0.2183763 -0.5104265 0.0861526 -0.4210659 1.6596029 1.5294129 0.8057784 0.1321648 0.7669640 -0.3732486 high
-0.4872402 1.0149946 -0.2723291 0.2183763 -0.8135788 -0.4218608 -0.4612898 1.6596029 1.5294129 0.8057784 0.4406159 0.2950453 -0.2645187 high
-0.4872402 1.0149946 -0.2723291 0.2183763 -0.1674232 0.5479830 -0.3617035 1.6596029 1.5294129 0.8057784 0.4406159 0.5092992 -0.2862647 high
-0.4872402 1.0149946 -0.2723291 -0.1958536 -0.0791817 0.7860033 -0.3304076 1.6596029 1.5294129 0.8057784 0.4234189 0.0303788 -0.3188837 high
-0.4872402 1.0149946 -0.2723291 0.2183763 0.2168544 0.2282543 -0.4267172 1.6596029 1.5294129 0.8057784 0.4019500 0.2390312 0.0725439 med_high
-0.4872402 1.0149946 -0.2723291 0.5117892 0.9896793 -0.0346338 -0.5993905 1.6596029 1.5294129 0.8057784 0.1972287 -0.1390638 0.7901611 high
-0.4872402 1.0149946 -0.2723291 0.2528955 -1.2206283 0.9529728 -0.6483526 1.6596029 1.5294129 0.8057784 -0.0448441 0.7683643 -0.9495170 high
-0.4872402 1.0149946 -0.2723291 0.2528955 -0.1745394 1.0240236 -0.7546351 1.6596029 1.5294129 0.8057784 -0.5905484 1.6029741 -1.0038819 high
-0.4872402 1.0149946 -0.2723291 0.5117892 0.2837472 0.8890270 -0.7074776 1.6596029 1.5294129 0.8057784 0.4330580 0.8439833 -0.6342003 high
-0.4872402 1.0149946 -0.2723291 0.5117892 -1.3956881 1.0204711 -0.8046419 1.6596029 1.5294129 0.8057784 -0.0788000 1.7164026 -1.1452307 high
-0.4872402 1.0149946 -0.2723291 0.5117892 -0.1418047 0.9991558 -0.7714940 1.6596029 1.5294129 0.8057784 0.2522154 0.7529604 -0.8625331 high
-0.4872402 1.0149946 -0.2723291 0.5117892 -0.0791817 0.6900847 -0.8756394 1.6596029 1.5294129 0.8057784 0.2918671 0.0639872 -0.1231699 high
-0.4872402 1.0149946 -0.2723291 -0.1958536 -0.0606794 -0.1376575 -0.1761129 1.6596029 1.5294129 0.8057784 0.4406159 -0.2678962 0.0507979 high
-0.4872402 1.0149946 -0.2723291 -0.1958536 0.6623317 0.2247018 -0.2200411 1.6596029 1.5294129 0.8057784 0.3986639 -0.6880018 0.1269088 high
-0.4872402 1.0149946 -0.2723291 -0.1958536 1.1049625 0.2993051 -0.1825715 1.6596029 1.5294129 0.8057784 0.4228712 -0.7902275 0.2682577 high
-0.4872402 1.0149946 -0.2723291 -0.1958536 -0.7438395 -1.0044776 0.1440166 1.6596029 1.5294129 0.8057784 0.3970209 -0.3127075 -0.0796779 med_high
-0.4872402 1.0149946 -0.2723291 0.2442657 -0.5887052 -0.9476370 -0.0337381 1.6596029 1.5294129 0.8057784 0.1539623 0.0961953 -0.2101538 med_high
-0.4872402 1.0149946 -0.2723291 0.2442657 0.0389481 -0.5923828 0.0933924 1.6596029 1.5294129 0.8057784 0.3499208 -0.2903018 -0.1449159 med_high
-0.4872402 1.0149946 -0.2723291 0.2442657 -0.2428554 0.3987763 -0.1183177 1.6596029 1.5294129 0.8057784 0.3943920 0.3258531 -0.3732486 high
-0.4872402 1.0149946 -0.2723291 0.2442657 -0.5403147 -0.5461998 -0.3052380 1.6596029 1.5294129 0.8057784 0.3455394 -0.1684712 -0.2101538 high
-0.4872402 2.4201701 -0.2723291 0.4686402 -1.1822006 0.8570542 -0.9375187 -0.6373311 1.7964164 0.7595879 0.4207900 0.7571615 -0.7972951 med_low
-0.4872402 2.4201701 -0.2723291 0.4686402 -1.2391306 1.0559965 -0.9686246 -0.6373311 1.7964164 0.7595879 -0.1382776 1.5847695 -1.6888801 med_low
-0.4872402 2.4201701 -0.2723291 0.4686402 -1.6959939 1.0453389 -0.9367114 -0.6373311 1.7964164 0.7595879 -0.4189067 2.3843705 -1.5692773 med_low
-0.4872402 2.4201701 -0.2723291 0.4686402 -0.4293012 1.0737592 -0.9151035 -0.6373311 1.7964164 0.7595879 0.3662415 0.7585618 -0.9712629 med_low
-0.4872402 2.4201701 -0.2723291 0.4686402 -0.4293012 0.5302203 -0.8002729 -0.6373311 1.7964164 0.7595879 0.4406159 0.0975957 -0.2645187 med_low
-0.4872402 -0.2108898 -0.2723291 0.2615253 -0.8221183 -0.5177794 -0.6711953 -0.4076377 -0.1022751 0.3438730 0.4406159 -0.0900515 -0.0796779 med_low
-0.4872402 -0.2108898 -0.2723291 0.2615253 -0.5104265 -0.9227692 -0.6711953 -0.4076377 -0.1022751 0.3438730 0.4406159 0.1312041 0.2138927 med_high
-0.4872402 -0.2108898 -0.2723291 0.2615253 -0.8747785 -1.4130199 -0.4732098 -0.4076377 -0.1022751 0.3438730 0.4010737 0.6927453 0.0616709 med_low
-0.4872402 -0.2108898 -0.2723291 0.2615253 -1.2732886 0.1536509 -0.4732098 -0.4076377 -0.1022751 0.3438730 0.4406159 1.1884699 -0.3080107 med_high
-0.4872402 -0.2108898 -0.2723291 0.2615253 -0.6982955 0.0719425 -0.4285218 -0.4076377 -0.1022751 0.3438730 0.4406159 0.2026221 -0.4602325 med_high
-0.4872402 -0.2108898 -0.2723291 0.2615253 -0.3780642 -0.1163422 -0.6581830 -0.4076377 -0.1022751 0.3438730 0.4406159 0.0373805 -0.1449159 med_low
-0.4872402 -0.2108898 -0.2723291 0.2615253 -1.0185268 0.1749662 -0.6625521 -0.4076377 -0.1022751 0.3438730 0.4282384 0.3426573 -0.5472164 med_low
-0.4872402 -0.2108898 -0.2723291 0.2615253 -0.3666782 0.3952238 -0.6158695 -0.4076377 -0.1022751 0.3438730 0.4406159 0.2348302 -0.6233273 med_low
-0.4872402 0.1156240 -0.2723291 0.1579678 0.4388814 0.0186544 -0.6251775 -0.9818712 -0.8024176 1.1753027 0.3868341 -0.4177339 -0.0144400 low
-0.4872402 0.1156240 -0.2723291 0.1579678 -0.2343159 0.2886475 -0.7159308 -0.9818712 -0.8024176 1.1753027 0.4406159 -0.5003546 -0.2101538 low
-0.4872402 0.1156240 -0.2723291 0.1579678 0.9839863 0.7966610 -0.7729187 -0.9818712 -0.8024176 1.1753027 0.4406159 -0.9820757 0.1486548 low
-0.4872402 0.1156240 -0.2723291 0.1579678 0.7249547 0.7362677 -0.6677760 -0.9818712 -0.8024176 1.1753027 0.4028263 -0.8644462 -0.0579320 med_low
-0.4872402 0.1156240 -0.2723291 0.1579678 -0.3624084 0.4343017 -0.6126402 -0.9818712 -0.8024176 1.1753027 0.4406159 -0.6683969 -1.1561037 low

We checked the scaled crime rate values and computed the quantiles of the crime data, which were then used to name the classes low (0-25%), med_low (25-50%), med_high (50-75%) and high (75-100%). This categorical “crime” variable was then added to the ‘boston_scaled’ data set in place of the original ‘crim’ values.

Quantiles:
The 0%, 25%, 50%, 75% and 100% quantiles are the cut points that divide the data distribution into four equally sized parts, so they tell how many values fall into each part of the distribution. The median is the 50% quantile, i.e. the center where the lower and upper halves of the distribution meet.
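As a toy illustration of this step (not the course script; the vector x below is a hypothetical stand-in for the scaled crim values), quartile-based classes like these can be created with quantile() and cut():

```r
# toy stand-in for the scaled crim values (hypothetical data, for illustration)
x <- c(5, 1, 9, 3, 7, 2, 8, 4, 6, 10)

# 0%, 25%, 50%, 75% and 100% cut points of the distribution
bins <- quantile(x)

# cut the vector into four quartile-based classes
crime <- cut(x, breaks = bins, include.lowest = TRUE,
             labels = c("low", "med_low", "med_high", "high"))

table(crime) # roughly equal counts in each class
```

Because the breaks are the quartiles of the data itself, each class holds about a quarter of the observations.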


1.3.3. Define training and test data sets

We are going to use 80% of the observations (rows) to train the linear discriminant analysis model.
Therefore we prepare a training data set ‘boston_train’ containing a random 80% of the rows, keeping the “crime” variable.
The test data set ‘boston_test’ contains the remaining 20% of the rows; we save its correct “crime” classes and then remove that variable, so that later on we can predict the crime rate classes of the test data with the linear discriminant model and compare the predictions to the truth.

# number of rows in the Boston dataset 
n_boston <- nrow(boston_scaled)

# choose 80% of the rows at random
ind_boston <- sample(n_boston, size = n_boston * 0.8)

# create the train set (the crime variable stays in the data set)
boston_train <- boston_scaled[ind_boston, ]

# create the test set from the remaining rows
boston_test <- boston_scaled[-ind_boston, ]

knitr::kable(boston_test, caption = "Test data set with the crime variable") %>% 
  kable_styling(bootstrap_options = "striped", full_width = FALSE, position = "center") %>% 
  scroll_box(width = "100%", height = "300px") # show the test set in a scrollable table
Test data set with the crime variable
zn indus chas nox rm age dis rad tax ptratio black lstat medv crime
4 -0.4872402 -1.3055857 -0.2723291 -0.8344581 1.0152978 -0.8090878 1.0766711 -0.7521778 -1.1050216 0.1129203 0.4157514 -1.3601708 1.1815886 low
7 0.0487240 -0.4761823 -0.2723291 -0.2648919 -0.3880270 -0.0701592 0.8384142 -0.5224844 -0.5769480 -1.5037485 0.4263763 -0.0312367 0.0399249 med_low
19 -0.4872402 -0.4368257 -0.2723291 -0.1440749 -1.1793541 -1.1359217 0.0006921 -0.6373311 -0.6006817 1.1753027 -0.7413783 -0.1348628 -0.2536457 med_high
20 -0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.7936533 0.0328645 0.0006921 -0.6373311 -0.6006817 1.1753027 0.3754425 -0.1922772 -0.4711055 med_high
24 -0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.6712537 1.1163897 0.1425445 -0.6373311 -0.6006817 1.1753027 0.4147656 1.0120256 -0.8734060 med_high
31 -0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.8135788 0.9067897 0.2079856 -0.6373311 -0.6006817 1.1753027 0.0382932 1.3929213 -1.0691198 med_high
(… 96 further rows omitted; the full test set of 102 observations is shown in a scrollable box in the rendered report …)
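One detail worth noting about the split above: sample() draws the rows at random, so the split changes on every knit unless a seed is fixed. A minimal sketch with toy data (hypothetical, not the Boston set) of a reproducible 80/20 split:

```r
set.seed(123) # fix the random number generator so the split is reproducible

# toy data frame standing in for boston_scaled (hypothetical, for illustration)
df <- data.frame(id = 1:100, y = rnorm(100))

# choose 80% of the row indices at random
ind <- sample(nrow(df), size = nrow(df) * 0.8)

train <- df[ind, ]  # 80 rows
test  <- df[-ind, ] # the remaining 20 rows

nrow(train) # 80
nrow(test)  # 20
```

With a fixed seed the same rows end up in the training and test sets on every run, which makes the later prediction results repeatable.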
# save the correct classes from test data
correct_classes <- boston_test$crime
correct_classes
##   [1] low      med_low  med_high med_high med_high med_high low     
##   [8] med_low  low      med_low  med_low  low      low      low     
##  [15] med_low  med_low  low      med_low  med_low  low      low     
##  [22] med_low  med_low  med_low  med_low  med_low  med_high med_high
##  [29] med_high med_high med_high med_high med_high med_high med_high
##  [36] med_low  low      low      low      low      low      low     
##  [43] med_low  med_low  med_high med_low  med_low  med_high med_high
##  [50] med_high med_low  med_low  med_low  low      low      med_high
##  [57] med_high med_low  med_high low      low      low      low     
##  [64] med_high med_high med_high low      low      low      low     
##  [71] low      low      high     high     high     high     high    
##  [78] high     high     high     high     high     high     high    
##  [85] high     high     high     high     high     high     high    
##  [92] high     high     high     high     med_high high     med_low 
##  [99] med_low  med_low  med_high low     
## Levels: low med_low med_high high
# remove the crime variable from the test data set, so the predicted
# classes can later be compared against the saved correct classes
boston_test <- dplyr::select(boston_test, -crime)
knitr::kable(boston_test, caption = "Test data without the crime variable") %>% 
  kable_styling(bootstrap_options = "striped", full_width = FALSE, position = "center") %>% 
  scroll_box(width = "100%", height = "300px") # show the test set in a scrollable table
Test data without the crime variable
zn indus chas nox rm age dis rad tax ptratio black lstat medv
4 -0.4872402 -1.3055857 -0.2723291 -0.8344581 1.0152978 -0.8090878 1.0766711 -0.7521778 -1.1050216 0.1129203 0.4157514 -1.3601708 1.1815886
7 0.0487240 -0.4761823 -0.2723291 -0.2648919 -0.3880270 -0.0701592 0.8384142 -0.5224844 -0.5769480 -1.5037485 0.4263763 -0.0312367 0.0399249
19 -0.4872402 -0.4368257 -0.2723291 -0.1440749 -1.1793541 -1.1359217 0.0006921 -0.6373311 -0.6006817 1.1753027 -0.7413783 -0.1348628 -0.2536457
20 -0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.7936533 0.0328645 0.0006921 -0.6373311 -0.6006817 1.1753027 0.3754425 -0.1922772 -0.4711055
24 -0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.6712537 1.1163897 0.1425445 -0.6373311 -0.6006817 1.1753027 0.4147656 1.0120256 -0.8734060
31 -0.4872402 -0.4368257 -0.2723291 -0.1440749 -0.8135788 0.9067897 0.2079856 -0.6373311 -0.6006817 1.1753027 0.0382932 1.3929213 -1.0691198
(… further rows of the test data omitted; the full table is shown in a scrollable box in the rendered report …)
240 0.7990739 -0.9047317 -0.2723291 -1.0933517 0.4573837 -0.9369793 1.1373158 -0.4076377 -0.6422155 -0.8570810 0.2969057 -0.7398148 0.0834169
245 0.4560568 -0.7691701 -0.2723291 -1.0674624 -0.9843688 0.2815424 1.9755128 -0.2927910 -0.4642132 0.2976825 0.1732405 -0.0214342 -0.5363434
249 0.4560568 -0.7691701 -0.2723291 -1.0674624 0.2111614 -0.6918540 1.9145357 -0.2927910 -0.4642132 0.2976825 0.1975573 -0.4387391 0.2138927
256 2.9429307 -1.0927687 -0.2723291 -1.4040242 -0.5815890 -1.7576164 2.5764502 -0.9818712 -0.5532144 -0.9494620 0.4217758 -0.4765487 -0.1775348
257 3.3717021 -1.0767345 -0.2723291 -1.3867646 1.6642999 -1.2211827 1.2067460 -0.7521778 -0.9744866 -1.1804147 0.3249467 -1.3363648 2.3341253
262 0.3703025 -1.0446662 -0.2723291 0.7965722 1.7582344 0.7398203 -0.7860734 -0.5224844 -0.8558183 -2.5199404 0.3471824 -0.7552187 2.2362684
268 0.3703025 -1.0446662 -0.2723291 0.1752274 2.8640998 -0.0559490 -0.6522468 -0.5224844 -0.8558183 -2.5199404 0.3052304 -0.7300124 2.9865046
270 0.3703025 -0.6088285 3.6647712 -0.7826793 -0.5189660 -0.2513388 0.0581549 -0.7521778 -1.0990882 0.0667298 0.3797143 0.1396062 -0.1992808
271 0.3703025 -0.6088285 -0.2723291 -0.7826793 -0.6100540 -0.9405319 0.3010658 -0.7521778 -1.0990882 0.0667298 0.3502494 0.0485834 -0.1557889
285 3.3717021 -1.1904313 -0.2723291 -1.3349859 1.1433903 -1.6972232 1.6679681 -0.9818712 -0.7312167 -1.4575580 0.4167372 -0.6725979 1.0511128
287 2.9429307 -1.3668070 -0.2723291 -1.4644327 -0.0777584 -1.3171013 2.5141909 -0.9818712 -0.9922868 -0.1180323 -0.1651137 0.0387809 -0.2645187
292 2.9429307 -0.9018164 -0.2723291 -1.2400582 1.2287853 -1.4520979 0.6276540 -0.6373311 -0.9685531 0.3438730 0.4406159 -1.2733490 1.6056352
302 0.9705825 -0.7356442 -0.2723291 -1.0502028 0.4346117 -1.0009251 0.8057412 -0.2927910 -0.4701466 -1.0880337 0.4280193 -0.4415399 -0.0579320
309 -0.4872402 -0.1802792 -0.2723291 -0.0922961 0.4986579 0.4946949 -0.2267846 -0.6373311 -0.6184819 -0.0256513 0.4406159 -1.1361145 0.0290519
315 -0.4872402 -0.1802792 -0.2723291 -0.0922961 0.4018769 0.6652169 -0.0915333 -0.6373311 -0.6184819 -0.0256513 0.4273621 -0.4723476 0.1377818
319 -0.4872402 -0.1802792 -0.2723291 -0.0922961 0.1385756 -0.0488439 -0.1246813 -0.6373311 -0.6184819 -0.0256513 0.4221044 -0.3211096 0.0616709
329 -0.4872402 -1.1510747 -0.2723291 -0.8171985 -0.5929750 -1.5195961 0.6741466 -0.6373311 0.1291279 -0.7185093 0.2822280 -0.3757233 -0.3515026
337 -0.4872402 -0.8668328 -0.2723291 -0.3425600 -0.5915517 -0.7913251 0.6819824 -0.5224844 -1.0931548 0.8057784 0.4406159 -0.3995293 -0.3297567
338 -0.4872402 -0.8668328 -0.2723291 -0.3425600 -0.5545472 -0.3188371 0.8642962 -0.5224844 -1.0931548 0.8057784 0.4177230 -0.2931025 -0.4384865
340 -0.4872402 -0.8668328 -0.2723291 -0.3425600 -0.4264547 -0.8232980 0.4830472 -0.5224844 -1.0931548 0.8057784 0.4406159 -0.4079314 -0.3841216
341 -0.4872402 -0.8668328 -0.2723291 -0.3425600 -0.4506500 -0.3579151 0.4830472 -0.5224844 -1.0931548 0.8057784 0.4406159 -0.4709472 -0.4167406
347 -0.4872402 -0.9834448 -0.2723291 -0.9725347 -0.5502775 -0.5781726 2.0033894 -0.7521778 -0.3336782 0.1591109 0.0869268 0.0023717 -0.5798354
357 -0.4872402 1.0149946 3.6647712 1.8580364 -0.1033769 1.0240236 -0.7944316 1.6596029 1.5294129 0.8057784 0.2306369 0.6927453 -0.5145975
359 -0.4872402 1.0149946 3.6647712 1.8580364 -0.2243532 0.5266678 -0.5092547 1.6596029 1.5294129 0.8057784 0.4245142 -0.1642701 0.0181789
360 -0.4872402 1.0149946 -0.2723291 1.8580364 -0.2457019 0.4520644 -0.6106931 1.6596029 1.5294129 0.8057784 0.3731422 0.0023717 0.0073060
373 -0.4872402 1.0149946 3.6647712 0.9777978 -0.5830122 0.7469254 -1.2658165 1.6596029 1.5294129 0.8057784 -0.0963256 -0.5283617 2.9865046
374 -0.4872402 1.0149946 -0.2723291 0.9777978 -1.9621417 1.1163897 -1.2446360 1.6596029 1.5294129 0.8057784 0.4406159 3.0971497 -0.9495170
378 -0.4872402 1.0149946 -0.2723291 1.0036872 0.7249547 1.0737592 -1.1573496 1.6596029 1.5294129 0.8057784 0.4406159 1.2024734 -1.0038819
393 -0.4872402 1.0149946 -0.2723291 1.2539511 -1.7771192 1.0098134 -0.9616911 1.6596029 1.5294129 0.8057784 0.4406159 1.8242297 -1.3953095
403 -0.4872402 1.0149946 -0.2723291 1.1935426 0.1698871 1.1163897 -1.0239029 1.6596029 1.5294129 0.8057784 0.2128922 1.0722407 -1.1343578
413 -0.4872402 1.0149946 -0.2723291 0.3650828 -2.3578052 1.1163897 -1.0643168 1.6596029 1.5294129 0.8057784 -3.5914839 3.0411357 -0.5037245
414 -0.4872402 1.0149946 -0.2723291 0.3650828 -1.6077524 1.1163897 -1.0474579 1.6596029 1.5294129 0.8057784 -1.5959718 1.0400326 -0.6776923
423 -0.4872402 1.0149946 -0.2723291 0.5117892 -0.9060900 0.6758745 -0.8756394 1.6596029 1.5294129 0.8057784 -0.7133373 0.2026221 -0.1884078
426 -0.4872402 1.0149946 -0.2723291 1.0727255 -0.5531240 0.9529728 -0.8953952 1.6596029 1.5294129 0.8057784 -3.8227126 1.6435843 -1.5475313
434 -0.4872402 1.0149946 -0.2723291 1.3661384 0.2154311 0.6865322 -0.7024911 1.6596029 1.5294129 0.8057784 -2.8094026 0.4994967 -0.8951520
435 -0.4872402 1.0149946 -0.2723291 1.3661384 -0.1090699 0.9387626 -0.7469417 1.6596029 1.5294129 0.8057784 -2.8045831 0.3524598 -1.1778497
436 -0.4872402 1.0149946 -0.2723291 1.5991427 0.4901184 0.9245525 -0.7932444 1.6596029 1.5294129 0.8057784 -2.7035916 1.4867449 -0.9930089
446 -0.4872402 1.0149946 -0.2723291 1.5991427 0.2481659 0.9316575 -0.8582106 1.6596029 1.5294129 0.8057784 -3.4351771 1.5861699 -1.1669767
452 -0.4872402 1.0149946 -0.2723291 1.3661384 0.5271229 1.0524439 -0.6837801 1.6596029 1.5294129 0.8057784 -0.0151600 0.7109499 -0.7972951
453 -0.4872402 1.0149946 -0.2723291 1.3661384 0.0175994 0.8250813 -0.6776064 1.6596029 1.5294129 0.8057784 0.3112548 0.6465337 -0.6994382
458 -0.4872402 1.0149946 -0.2723291 1.3661384 -0.4961940 0.4165390 -0.4824229 1.6596029 1.5294129 0.8057784 -3.8684983 0.6003221 -0.9821359
468 -0.4872402 1.0149946 -0.2723291 0.2528955 -0.4008362 0.9209999 -0.5958763 1.6596029 1.5294129 0.8057784 -0.2780445 1.2136763 -0.3732486
469 -0.4872402 1.0149946 -0.2723291 0.2183763 -0.5104265 0.0861526 -0.4210659 1.6596029 1.5294129 0.8057784 0.1321648 0.7669640 -0.3732486
471 -0.4872402 1.0149946 -0.2723291 0.2183763 -0.1674232 0.5479830 -0.3617035 1.6596029 1.5294129 0.8057784 0.4406159 0.5092992 -0.2862647
478 -0.4872402 1.0149946 -0.2723291 0.5117892 -1.3956881 1.0204711 -0.8046419 1.6596029 1.5294129 0.8057784 -0.0788000 1.7164026 -1.1452307
485 -0.4872402 1.0149946 -0.2723291 0.2442657 -0.5887052 -0.9476370 -0.0337381 1.6596029 1.5294129 0.8057784 0.1539623 0.0961953 -0.2101538
487 -0.4872402 1.0149946 -0.2723291 0.2442657 -0.2428554 0.3987763 -0.1183177 1.6596029 1.5294129 0.8057784 0.3943920 0.3258531 -0.3732486
489 -0.4872402 2.4201701 -0.2723291 0.4686402 -1.1822006 0.8570542 -0.9375187 -0.6373311 1.7964164 0.7595879 0.4207900 0.7571615 -0.7972951
490 -0.4872402 2.4201701 -0.2723291 0.4686402 -1.2391306 1.0559965 -0.9686246 -0.6373311 1.7964164 0.7595879 -0.1382776 1.5847695 -1.6888801
491 -0.4872402 2.4201701 -0.2723291 0.4686402 -1.6959939 1.0453389 -0.9367114 -0.6373311 1.7964164 0.7595879 -0.4189067 2.3843705 -1.5692773
495 -0.4872402 -0.2108898 -0.2723291 0.2615253 -0.5104265 -0.9227692 -0.6711953 -0.4076377 -0.1022751 0.3438730 0.4406159 0.1312041 0.2138927
502 -0.4872402 0.1156240 -0.2723291 0.1579678 0.4388814 0.0186544 -0.6251775 -0.9818712 -0.8024176 1.1753027 0.3868341 -0.4177339 -0.0144400

1.3.4. Linear discriminant analysis of the training data set

Here we now run the LDA analysis on the ‘boston_train’ data set, i.e. we fit a classifier for the “crime” variable of our training data set.

# linear discriminant analysis
lda.fit <- lda(crime ~ ., data = boston_train)

# print the lda.fit object
lda.fit
## Call:
## lda(crime ~ ., data = boston_train)
## 
## Prior probabilities of groups:
##       low   med_low  med_high      high 
## 0.2450495 0.2500000 0.2500000 0.2549505 
## 
## Group means:
##                   zn      indus        chas        nox          rm
## low       0.96776729 -0.8980913 -0.07348562 -0.8791063  0.50672302
## med_low  -0.07184139 -0.3303595 -0.03844192 -0.5777090 -0.06234221
## med_high -0.38365582  0.1784617  0.11748284  0.3922538  0.12142617
## high     -0.48724019  1.0170891 -0.11943197  1.0609119 -0.39758898
##                 age        dis        rad        tax    ptratio      black
## low      -0.8717059  0.8832840 -0.6941744 -0.7255230 -0.4637615  0.3823775
## med_low  -0.3785268  0.3804356 -0.5531860 -0.5303619 -0.1175750  0.3305797
## med_high  0.4228703 -0.3713792 -0.4042264 -0.3041286 -0.2524880  0.1095973
## high      0.8109401 -0.8704238  1.6384176  1.5142626  0.7811136 -0.7552212
##                lstat        medv
## low      -0.80551822  0.55721555
## med_low  -0.17588891  0.06296274
## med_high  0.01905118  0.17739824
## high      0.86812919 -0.71685614
## 
## Coefficients of linear discriminants:
##                  LD1         LD2        LD3
## zn       0.060099250  0.73620470 -0.9855179
## indus    0.008064383 -0.12780753  0.3095989
## chas    -0.101073981 -0.04681437  0.0461922
## nox      0.374799602 -0.85050153 -1.3408260
## rm      -0.089856978 -0.05820899 -0.2142469
## age      0.258592866 -0.23699091 -0.1745051
## dis     -0.057768768 -0.26744922  0.1558164
## rad      3.087398715  1.05894467  0.1457907
## tax      0.118815758 -0.14944306  0.3254184
## ptratio  0.121013500 -0.04833078 -0.3535688
## black   -0.128757895  0.03061410  0.1343696
## lstat    0.206677615 -0.24731102  0.3946104
## medv     0.182532860 -0.43817126 -0.1512821
## 
## Proportion of trace:
##    LD1    LD2    LD3 
## 0.9521 0.0361 0.0117
# the function for lda biplot arrows
lda.arrows <- function(x, myscale = 1, arrow_heads = 0.1, color = "red", tex = 0.75, choices = c(1,2)){
  heads <- coef(x)
  arrows(x0 = 0, y0 = 0, 
         x1 = myscale * heads[,choices[1]], 
         y1 = myscale * heads[,choices[2]], col=color, length = arrow_heads)
  text(myscale * heads[,choices], labels = row.names(heads), 
       cex = tex, col=color, pos=3)
}

# target classes as numeric
classes <- as.numeric(boston_train$crime)

# plot the lda results
plot(lda.fit, dimen = 2, col = classes, pch = classes, main = "LDA biplot of Boston training data")
lda.arrows(lda.fit, myscale = 2)

The linear discriminant analysis shows the classification of the “crime” variable. In the LDA biplot we can see the separation of the different crime rate classes.
The coefficients of the linear discriminants indicate how strongly each variable contributes to separating the classes. Now that we have fitted the LDA on the training data set, we use it to predict the crime rate classes of the test data set.


1.3.5. Prediction of crime classes on the test data set

# predict classes with test data
lda.pred <- predict(lda.fit, newdata = boston_test)
lda.pred$class
##   [1] med_low  med_low  med_low  med_low  med_high med_high med_low 
##   [8] med_low  low      med_low  med_low  med_low  low      low     
##  [15] med_low  med_low  low      med_low  med_low  med_low  low     
##  [22] med_high med_high med_low  med_high med_low  med_high med_high
##  [29] med_high med_high med_high med_high med_high med_high med_high
##  [36] med_low  med_low  med_low  med_low  low      low      low     
##  [43] med_low  med_low  med_low  med_low  med_low  med_low  med_high
##  [50] med_high low      med_low  med_low  low      low      med_high
##  [57] med_high med_low  low      low      low      low      low     
##  [64] med_low  med_high med_low  med_low  med_low  med_low  med_low 
##  [71] med_low  med_low  high     high     high     high     high    
##  [78] high     high     high     high     high     high     high    
##  [85] high     high     high     high     high     high     high    
##  [92] high     high     high     high     high     high     med_high
##  [99] med_high med_high med_high med_high
## Levels: low med_low med_high high
# cross tabulate the results
pred_table <- table(correct = correct_classes, predicted = lda.pred$class)

knitr::kable(pred_table, align = "c",  caption="Prediction of test data" ) %>% kable_styling(bootstrap_options = "striped", full_width = FALSE, position = "center") %>% add_header_above(c(" " = 1, "Predicted crime rate" = 4)) %>%  pack_rows("Correct rate", 1, 4)
Prediction of test data
                 Predicted crime rate
Correct rate   low   med_low   med_high   high
low             14        13          1      0
med_low          1        18          6      0
med_high         1         6         17      1
high             0         0          0     24

It seems that the prediction using linear discriminant analysis is maybe not the best.
The predictions were weakest for the lower crime rate classes: the crime rate was often predicted higher than it actually is (see the observations predicted as med_low or med_high which are actually low or med_low). Quite a number of medium-level observations were also misclassified, but for the high crime rate class the prediction is quite good, with just one value predicted wrongly.
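The impression from the cross tabulation can be summarized in a single accuracy number: the diagonal of the table holds the correctly predicted observations. A small sketch, with the table above re-entered as a matrix for illustration:

```r
# re-enter the cross tabulation from above as a matrix (rows = correct class)
classes <- c("low", "med_low", "med_high", "high")
pred_table <- matrix(c(14, 13,  1,  0,
                        1, 18,  6,  0,
                        1,  6, 17,  1,
                        0,  0,  0, 24),
                     nrow = 4, byrow = TRUE,
                     dimnames = list(correct = classes, predicted = classes))

# overall accuracy: correctly classified observations over all observations
sum(diag(pred_table)) / sum(pred_table)  # about 0.72

# accuracy per correct class (rows): high is perfect, low is weakest
diag(pred_table) / rowSums(pred_table)
```

In the actual analysis the same two lines can of course be run directly on the `pred_table` object produced by `table()`.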


1.3.6. Reload and standardize the Boston dataset and calculate the distances

Now we are going to perform a data clustering. This differs from classification, because in classification the classes are defined beforehand (in our case the crime rate classes from low to high). In clustering, the analysis itself reveals where the data separates into groups.

So we reload the original ‘Boston’ data set and scale it again into a new data frame ‘boston_scaled2’. Then we calculate the Euclidean distances between the observations, and finally run the k-means clustering.

# reload the original Boston data set
data(Boston) # load the Boston data set

# scale the Boston data set again - named boston_scaled2
boston_scaled2 <- scale(Boston)
knitr::kable(head(boston_scaled2), caption = "Scaled Boston data set" ) %>% 
              kable_styling(bootstrap_options = "striped", full_width = FALSE, position = "center") %>% 
              scroll_box(width = "100%", height = "300px") # the data frame head
Scaled Boston data set
crim zn indus chas nox rm age dis rad tax ptratio black lstat medv
-0.4193669 0.2845483 -1.2866362 -0.2723291 -0.1440749 0.4132629 -0.1198948 0.140075 -0.9818712 -0.6659492 -1.4575580 0.4406159 -1.0744990 0.1595278
-0.4169267 -0.4872402 -0.5927944 -0.2723291 -0.7395304 0.1940824 0.3668034 0.556609 -0.8670245 -0.9863534 -0.3027945 0.4406159 -0.4919525 -0.1014239
-0.4169290 -0.4872402 -0.5927944 -0.2723291 -0.7395304 1.2814456 -0.2655490 0.556609 -0.8670245 -0.9863534 -0.3027945 0.3960351 -1.2075324 1.3229375
-0.4163384 -0.4872402 -1.3055857 -0.2723291 -0.8344581 1.0152978 -0.8090878 1.076671 -0.7521778 -1.1050216 0.1129203 0.4157514 -1.3601708 1.1815886
-0.4120741 -0.4872402 -1.3055857 -0.2723291 -0.8344581 1.2273620 -0.5106743 1.076671 -0.7521778 -1.1050216 0.1129203 0.4406159 -1.0254866 1.4860323
-0.4166314 -0.4872402 -1.3055857 -0.2723291 -0.8344581 0.2068916 -0.3508100 1.076671 -0.7521778 -1.1050216 0.1129203 0.4101651 -1.0422909 0.6705582
# euclidean distance matrix
dist_eu <- dist(boston_scaled2)

# look at the summary of the distances
summary(dist_eu)
##    Min. 1st Qu.  Median    Mean 3rd Qu.    Max. 
##  0.1343  3.4625  4.8241  4.9111  6.1863 14.3970
# manhattan distance matrix
dist_man <- dist(boston_scaled2, method = "manhattan")

# look at the summary of the distances
summary(dist_man)
##    Min. 1st Qu.  Median    Mean 3rd Qu.    Max. 
##  0.2662  8.4832 12.6090 13.5488 17.7568 48.8618
# k-means clustering
km_boston <- kmeans(boston_scaled2, centers = 3)

# plot the Boston dataset with clusters
pairs(boston_scaled2, col = km_boston$cluster)

pairs(boston_scaled2[,1:5], col = km_boston$cluster)

pairs(boston_scaled2[,6:10], col = km_boston$cluster)

pairs(boston_scaled2[,11:14], col = km_boston$cluster)

I have performed the k-means clustering with several cluster numbers and think that 3 clusters give the best overview of the data. The clustering worked well for the crim variable, the nitrogen oxides concentration (nox), and also for age, tax and the medv variable. Next we check where the total within-cluster sum of squares (twcss) drops, in a plot covering up to 10 clusters: the number of clusters at which this value decreases sharply gives a good summary of the data.

1.3.7. Determine the k

set.seed(123)


# determine the number of clusters
k_max <- 10

# calculate the total within sum of squares
twcss <- sapply(1:k_max, function(k){kmeans(boston_scaled2, k)$tot.withinss})

# visualize the results
qplot(x = 1:k_max, y = twcss, geom = 'line')

# k-means clustering
km_boston <- kmeans(boston_scaled2, centers = 3)

# plot the Boston dataset with clusters
pairs(boston_scaled2, col = km_boston$cluster)

The twcss value drops sharply at a level of 2 - 3 clusters. So I think the data is optimally clustered with three clusters, as seen in the overview plot.
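The drop in twcss can also be located numerically instead of eyeballing the plot. A sketch with hypothetical twcss values (in the analysis above, `twcss` comes from the `sapply(1:k_max, ...)` call on `boston_scaled2`):

```r
# hypothetical twcss values for k = 1..10, for illustration only
twcss <- c(7082, 4537, 3703, 3306, 2999, 2798, 2654, 2542, 2442, 2352)

drops <- -diff(twcss)  # decrease in twcss when adding one more cluster
which.max(drops) + 1   # k after the single largest drop, here 2
```

The single largest drop is typically from 1 to 2 clusters; the flattening of the curve after that is what supports choosing three clusters here.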

1.3.8. Bonus

  • Perform the LDA using the clusters as target.
  • Include all the variables in the Boston data in the LDA model.
  • Visualize the results with a biplot (include arrows representing the relationships of the original variables to the LDA solution)
data(Boston) # load the Boston data set

# scale the Boston data set again - named boston_scaled3
boston_scaled3 <- scale(Boston)

# k-means clustering
km_boston <-kmeans(boston_scaled3, centers = 3)
cluster <- km_boston$cluster

# add the cluster number to the dataframe
boston_scaled3 <- data.frame(boston_scaled3, cluster)

# linear discriminant analysis of clusters vs. all other variables
lda.fit_cluster <- lda(cluster ~ ., data = boston_scaled3)

# print the lda.fit object
lda.fit_cluster
## Call:
## lda(cluster ~ ., data = boston_scaled3)
## 
## Prior probabilities of groups:
##         1         2         3 
## 0.2470356 0.3260870 0.4268775 
## 
## Group means:
##         crim         zn      indus         chas        nox         rm
## 1 -0.3989700  1.2614609 -0.9791535 -0.020354653 -0.8573235  1.0090468
## 2  0.7982270 -0.4872402  1.1186734  0.014005495  1.1351215 -0.4596725
## 3 -0.3788713 -0.3578148 -0.2879024  0.001080671 -0.3709704 -0.2328004
##           age        dis        rad        tax     ptratio      black
## 1 -0.96130713  0.9497716 -0.5867985 -0.6709807 -0.80239137  0.3552363
## 2  0.79930921 -0.8549214  1.2113527  1.2873657  0.59162230 -0.6363367
## 3 -0.05427143  0.1034286 -0.5857564 -0.5951053  0.01241316  0.2805140
##        lstat        medv
## 1 -0.9571271  1.06668290
## 2  0.8622388 -0.67953738
## 3 -0.1047617 -0.09820229
## 
## Coefficients of linear discriminants:
##                 LD1         LD2
## crim    -0.03206338 -0.19094456
## zn       0.02935900 -1.07677218
## indus    0.63347352 -0.09917524
## chas     0.02460719  0.10009606
## nox      1.11749317 -0.75995105
## rm      -0.18841682 -0.57360135
## age     -0.12983139  0.47226685
## dis      0.04493809 -0.34585958
## rad      0.67004295 -0.08584353
## tax      1.03992455 -0.58075025
## ptratio  0.25864960 -0.02605279
## black   -0.01657236  0.01975686
## lstat    0.17365575 -0.41704235
## medv    -0.06819126 -0.79098605
## 
## Proportion of trace:
##    LD1    LD2 
## 0.8506 0.1494
# the function for lda biplot arrows
lda.arrows <- function(x, myscale = 1, arrow_heads = 0.1, color = "red", tex = 0.75, choices = c(1,2)){
  heads <- coef(x)
  arrows(x0 = 0, y0 = 0, 
         x1 = myscale * heads[,choices[1]], 
         y1 = myscale * heads[,choices[2]], col=color, length = arrow_heads)
  text(myscale * heads[,choices], labels = row.names(heads), 
       cex = tex, col=color, pos=3)
}

# target classes as numeric
classes3 <- as.numeric(boston_scaled3$cluster)
# plot the lda results

plot(lda.fit_cluster, dimen = 2, col = classes3, pch = classes3, main = "LDA biplot using three clusters 1, 2 and 3")
lda.arrows(lda.fit_cluster, myscale = 2)

So I scaled the Boston data set, defined the k-means clusters ‘km_boston’ and added the cluster number to the scaled data frame. Then I ran the LDA model with the clusters against all other variables and added the arrows.
In the LDA biplot you can see how nicely the clusters are separated from each other - so the clustering actually worked. The most influential linear separators seem to be “age”, “zn” (proportion of residential land zoned for lots over 25,000 sq.ft.), “nox” (nitrogen oxides concentration) and “tax” (full-value property-tax rate).
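The visual reading of the arrows can be backed up numerically: each variable's arrow length is the magnitude of its (LD1, LD2) coefficient pair. A sketch using the printed coefficients from above, re-entered (rounded) for illustration:

```r
# (LD1, LD2) coefficients of lda.fit_cluster, rounded to three decimals
coefs <- rbind(
  crim    = c(-0.032, -0.191), zn    = c( 0.029, -1.077),
  indus   = c( 0.633, -0.099), chas  = c( 0.025,  0.100),
  nox     = c( 1.117, -0.760), rm    = c(-0.188, -0.574),
  age     = c(-0.130,  0.472), dis   = c( 0.045, -0.346),
  rad     = c( 0.670, -0.086), tax   = c( 1.040, -0.581),
  ptratio = c( 0.259, -0.026), black = c(-0.017,  0.020),
  lstat   = c( 0.174, -0.417), medv  = c(-0.068, -0.791)
)

# arrow length per variable = Euclidean norm of its coefficient pair
head(sort(sqrt(rowSums(coefs^2)), decreasing = TRUE))
```

With the fitted object itself, `coef(lda.fit_cluster)` gives the same matrix directly.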

1.3.9. Super-Bonus

#run the code for the scaled training data set
model_predictors <- dplyr::select(boston_train, -crime)

# check the dimensions
dim(model_predictors)
## [1] 404  13
dim(lda.fit$scaling)
## [1] 13  3
# matrix multiplication
matrix_product <- as.matrix(model_predictors) %*% lda.fit$scaling
matrix_product <- as.data.frame(matrix_product)

# load the plotly package (install it first if needed)
library(plotly)

crimeplot <- plot_ly(x = matrix_product$LD1, y = matrix_product$LD2, z = matrix_product$LD3, type= 'scatter3d', mode='markers', color = boston_train$crime)
crimeplot
n_boston3 <- nrow(boston_scaled3) # number of observations in boston_scaled3

# choose randomly 80% of the rows
ind_boston3 <- sample(n_boston3,  size = n_boston3 * 0.8)

# kmeans clustering of boston_scaled3
km_cluster <-  kmeans(boston_scaled3, centers = 3)
boston_scaled3$cluster <- km_cluster$cluster


clustertrain <- boston_scaled3[ind_boston3,]

clusterplot <- plot_ly(x = matrix_product$LD1, y = matrix_product$LD2, z = matrix_product$LD3, type= 'scatter3d', mode='markers', color = clustertrain$cluster)
clusterplot

The 3D plots of the training data, coloured by the crime classes and by the k-means clusters, are visible here.


Chapter 5: Dimensionality reduction techniques

Data wrangling and performing principal component analysis (PCA) & multiple correspondence analysis (MCA)

Work of week 48 (25.11. - 01.12.2019)


1. Analysis of human data set

The data wrangling was done in two steps: first, combining two data sets into one data frame, and second, refining the data to the variables we want to analyse. The data wrangling script was uploaded to my GitHub repository. You can find the data wrangling script here.

1.1. Load the data set

# load necessary packages
library(tidyr)
library(dplyr)
library(corrplot)
library(ggplot2)
library(GGally)
library(knitr)
library(kableExtra)
library(stringr)
library(ggfortify)
library(factoextra)
# load the data set "human"
human <- read.table(file = 
             "C:/Users/richla/OneDrive/1 C - R-Folder/11-IODS-course/IODS-project/data/human_analys.txt")

1.2. The data set (overview and correlations)

The original complete data set from the United Nations was wrangled and combined into the so-called “human” data set. This human data set consists of 8 variables with 155 observations. The data set includes the following variables:

  • country –> Name of the country
  • ratio.sec.edu –> Ratio of females to males with at least secondary education
  • ratio.lab.force –> Ratio of female to male labour force participation
  • edu.expect –> Expected years of schooling
  • life.exp –> Life expectancy at birth
  • GNI –> Gross National Income per capita
  • mat.mor.r –> Maternal mortality ratio
  • adol.birth –> Adolescent birth rate
  • rep.parliament –> Percentage of female representatives in parliament

The data was collected from the United Nations. More information about the data and how it was collected can be found here. Technical notes about calculating the human development indices can be found here.

Here you see the structure, the complete table and the summary of the wrangled human data set.

# check the data set "human"
str(human)
## 'data.frame':    155 obs. of  8 variables:
##  $ ratio.sec.edu  : num  1.007 0.997 0.983 0.989 0.969 ...
##  $ ratio.lab.force: num  0.891 0.819 0.825 0.884 0.829 ...
##  $ edu.expect     : num  17.5 20.2 15.8 18.7 17.9 16.5 18.6 16.5 15.9 19.2 ...
##  $ life.exp       : num  81.6 82.4 83 80.2 81.6 80.9 80.9 79.1 82 81.8 ...
##  $ GNI            : int  64992 42261 56431 44025 45435 43919 39568 52947 42155 32689 ...
##  $ mat.mor.r      : int  4 6 6 5 6 7 9 28 11 8 ...
##  $ adol.birth     : num  7.8 12.1 1.9 5.1 6.2 3.8 8.2 31 14.5 25.3 ...
##  $ rep.parliament : num  39.6 30.5 28.5 38 36.9 36.9 19.9 19.4 28.2 31.4 ...
#  data set table
knitr::kable(human) %>% 
  kable_styling(bootstrap_options = "striped", full_width = FALSE, position = "center") %>% 
  scroll_box(width = "100%", height = "300px")
ratio.sec.edu ratio.lab.force edu.expect life.exp GNI mat.mor.r adol.birth rep.parliament
Norway 1.0072389 0.8908297 17.5 81.6 64992 4 7.8 39.6
Australia 0.9968288 0.8189415 20.2 82.4 42261 6 12.1 30.5
Switzerland 0.9834369 0.8251001 15.8 83.0 56431 6 1.9 28.5
Denmark 0.9886128 0.8840361 18.7 80.2 44025 5 5.1 38.0
Netherlands 0.9690608 0.8286119 17.9 81.6 45435 6 6.2 36.9
Germany 0.9927835 0.8072289 16.5 80.9 43919 7 3.8 36.9
Ireland 1.0241730 0.7797357 18.6 80.9 39568 9 8.2 19.9
United States 1.0031646 0.8171263 16.5 79.1 52947 28 31.0 19.4
Canada 1.0000000 0.8676056 15.9 82.0 42155 11 14.5 28.2
New Zealand 0.9968520 0.8401084 19.2 81.8 32689 8 25.3 31.4
Singapore 0.9148148 0.7616580 15.4 83.0 76628 6 6.0 25.3
Sweden 0.9908362 0.8880707 15.8 82.2 45636 4 6.5 43.6
United Kingdom 0.9989990 0.8107715 16.2 80.7 39267 8 25.8 23.5
Iceland 0.9934498 0.9108527 19.0 82.6 35182 4 11.5 41.3
Korea (Republic of) 0.8641975 0.6948682 16.9 81.9 33890 27 2.2 16.3
Israel 0.9667812 0.8379161 16.0 82.4 30676 2 7.8 22.5
Luxembourg 1.0000000 0.7848297 13.9 81.7 58711 11 8.3 28.3
Japan 1.0139860 0.6931818 15.3 83.5 36927 6 5.4 11.6
Belgium 0.9348613 0.8010118 16.3 80.8 41187 6 6.7 42.4
France 0.9375000 0.8230519 16.0 82.2 38056 12 5.7 25.7
Austria 1.0000000 0.8064993 15.7 81.4 43869 4 4.1 30.3
Finland 1.0000000 0.8703125 17.1 80.8 38695 4 9.2 42.5
Slovenia 0.9775510 0.8275316 16.8 80.4 27852 7 0.6 27.7
Spain 0.9138167 0.7978723 17.3 82.6 32045 4 10.6 38.0
Italy 0.8844720 0.6655462 16.0 83.1 33030 4 4.0 30.1
Czech Republic 1.0020060 0.7481698 16.4 78.6 26660 5 4.9 18.9
Greece 0.8880597 0.7072000 17.6 80.9 24524 5 11.9 21.0
Estonia 1.0000000 0.8156749 16.5 76.8 25214 11 16.8 19.8
Cyprus 0.9302326 0.7876231 14.0 80.2 28633 10 5.5 12.5
Qatar 1.1305085 0.5319372 13.8 78.2 123124 6 9.5 0.0
Slovakia 0.9959799 0.7448980 15.1 76.3 25845 7 15.9 18.7
Poland 0.9286550 0.7534669 15.5 77.4 23177 3 12.2 22.1
Lithuania 0.9448568 0.8291233 16.4 73.3 24500 11 10.6 23.4
Malta 0.8772379 0.5716440 14.4 80.6 27930 9 18.2 13.0
Saudi Arabia 0.8605974 0.2579821 16.3 74.3 52821 16 10.2 19.9
Argentina 0.9774306 0.6333333 17.9 76.3 22050 69 54.4 36.8
United Arab Emirates 1.1944444 0.5054348 13.3 77.0 60868 8 27.6 17.5
Chile 0.9594241 0.6577540 15.2 81.7 21290 22 55.3 15.8
Portugal 0.9896266 0.8293051 16.3 80.9 25757 8 12.6 31.3
Hungary 0.9918946 0.7466667 15.4 75.2 22916 14 12.1 10.1
Bahrain 1.1031128 0.4510932 14.4 76.6 38599 22 13.8 15.0
Latvia 0.9989899 0.8121302 15.2 74.2 22281 13 13.5 18.0
Croatia 0.9081197 0.7654110 14.8 77.3 19409 13 12.7 25.8
Kuwait 0.9875666 0.5246691 14.7 74.4 83961 14 14.5 1.5
Montenegro 0.8891235 0.7504363 15.2 76.2 14558 7 15.2 17.3
Belarus 0.9436009 0.7939778 15.7 71.3 16676 1 20.6 30.1
Russian Federation 0.9686486 0.7963738 14.7 70.1 22352 24 25.7 14.5
Oman 0.8266200 0.3510896 13.6 76.8 34858 11 10.6 9.6
Romania 0.9358696 0.7503852 14.2 74.7 18108 33 31.0 12.0
Uruguay 1.0815109 0.7239583 15.5 77.2 19283 14 58.3 11.5
Bahamas 1.0410959 0.8738966 12.6 75.4 21336 37 28.5 16.7
Kazakhstan 0.9645749 0.8690629 15.0 69.4 20867 26 29.9 20.1
Barbados 1.0205245 0.8603133 15.4 75.6 12488 52 48.4 19.6
Bulgaria 0.9717868 0.8118644 14.4 74.2 15596 5 35.9 20.4
Panama 1.0821643 0.5990220 13.3 77.6 18192 85 78.5 19.3
Malaysia 0.9130435 0.5880795 12.7 74.7 22762 29 5.7 14.2
Mauritius 0.8517241 0.5876011 15.6 74.4 17470 73 30.9 11.6
Trinidad and Tobago 0.9802956 0.7019868 12.3 70.4 26090 84 34.8 24.7
Serbia 0.7934783 0.7307061 14.4 74.9 12190 16 16.9 34.0
Cuba 0.9428934 0.6200000 13.8 79.4 7301 80 43.1 48.9
Lebanon 0.9566787 0.3286319 13.8 79.3 16509 16 12.0 3.1
Costa Rica 1.0039604 0.5898734 13.9 79.4 13413 38 60.8 33.3
Iran (Islamic Republic of) 0.9201183 0.2255435 15.1 75.4 15440 23 31.6 3.1
Venezuela (Bolivarian Republic of) 1.1141732 0.6452020 14.2 74.2 16159 110 83.2 17.0
Turkey 0.6500000 0.4152542 14.5 75.3 18677 20 30.9 14.4
Sri Lanka 0.9515707 0.4600262 13.7 74.9 9779 29 16.9 5.8
Mexico 0.9191419 0.5644556 13.1 76.8 16056 49 63.4 37.1
Brazil 1.0419847 0.7351485 15.2 74.5 15175 69 70.8 9.6
Georgia 0.9676375 0.7523302 13.8 74.9 7164 41 46.8 11.3
Azerbaijan 0.9620123 0.9037356 11.9 70.8 16428 26 40.0 15.6
Jordan 0.8853503 0.2342342 13.5 74.0 11365 50 26.5 11.6
The former Yugoslav Republic of Macedonia 0.7230216 0.6385185 13.4 75.4 11780 7 18.3 33.3
Ukraine 0.9562044 0.7952167 15.1 71.0 8178 23 25.7 11.8
Algeria 0.8612903 0.2105263 14.0 74.8 13054 89 10.0 25.7
Peru 0.8517398 0.8080569 13.1 74.6 11015 89 50.7 22.3
Albania 0.9306030 0.6854962 11.8 77.8 9943 21 15.3 20.7
Armenia 0.9894737 0.7465565 12.3 74.7 8124 29 27.1 10.7
Bosnia and Herzegovina 0.6432665 0.5951134 13.6 76.5 9638 8 15.1 19.3
Ecuador 1.0177665 0.6614268 14.2 75.9 10605 87 77.0 41.6
China 0.8164117 0.8160920 13.1 75.8 12547 32 8.6 23.6
Fiji 0.9953488 0.5208333 15.7 70.0 7493 59 42.8 14.0
Mongolia 1.0142687 0.8167388 14.6 69.4 10729 68 18.7 14.9
Thailand 0.8750000 0.7967782 13.5 74.4 13323 26 41.0 6.1
Libya 1.3245823 0.3926702 14.0 71.6 14911 15 2.5 16.0
Tunisia 0.7114967 0.3540197 14.6 74.8 10404 46 4.6 31.3
Colombia 1.0233813 0.7001255 13.5 74.0 12040 83 68.5 20.9
Jamaica 1.0541311 0.7912553 12.4 75.7 7415 80 70.1 16.7
Tonga 0.9909400 0.7171582 14.7 72.8 5069 120 18.1 0.0
Belize 1.0079156 0.5978129 13.6 70.0 7614 45 71.4 13.3
Dominican Republic 1.0470810 0.6526718 13.1 73.5 11883 100 99.6 19.1
Suriname 0.9469214 0.5886628 12.7 71.1 15617 130 35.2 11.8
Maldives 0.8348624 0.7251613 13.0 76.8 12328 31 4.2 5.9
Samoa 1.0716667 0.4023973 12.9 73.4 5327 58 28.3 6.1
Botswana 0.9448010 0.8811275 12.5 64.5 16646 170 44.2 9.5
Moldova (Republic of) 0.9689441 0.8506787 11.9 71.6 5223 21 29.3 20.8
Egypt 0.7244224 0.3168449 13.5 71.1 10512 45 43.0 2.2
Gabon 1.4930748 0.8593272 12.5 64.4 16367 240 103.0 16.2
Indonesia 0.8109756 0.6104513 13.0 68.9 9788 190 48.3 17.1
Paraguay 0.8558140 0.6568396 11.9 72.9 7643 110 67.0 16.8
Philippines 1.0345369 0.6411543 11.3 68.2 7915 120 46.8 27.1
El Salvador 0.8440367 0.6050633 12.3 73.0 7349 69 76.0 27.4
South Africa 0.9578393 0.7355372 13.6 57.4 12122 140 50.9 40.7
Viet Nam 0.8342697 0.8880779 11.9 75.8 5092 49 29.0 24.3
Bolivia (Plurinational State of) 0.8054146 0.7935723 13.2 68.3 5760 200 71.9 51.8
Kyrgyzstan 0.9762397 0.7044025 12.5 70.6 3044 75 29.3 23.3
Iraq 0.5537849 0.2134670 10.1 69.4 14003 67 68.7 26.5
Guyana 1.2615063 0.5291925 10.3 66.4 6522 250 88.5 31.3
Nicaragua 1.0287206 0.5902864 11.5 74.9 4457 100 100.8 39.1
Morocco 0.6854305 0.3496042 11.6 74.0 6850 120 35.8 11.0
Namibia 0.9680233 0.8587127 11.3 64.8 9418 130 54.9 37.7
Guatemala 0.9439655 0.5589569 10.7 71.8 6929 140 97.2 13.3
Tajikistan 1.0427632 0.7639429 11.2 69.4 2517 44 42.8 15.2
India 0.4770318 0.3379224 11.7 68.0 5497 190 32.8 12.2
Honduras 1.0852713 0.5162847 11.1 73.1 3938 120 84.0 25.8
Bhutan 0.9855072 0.8639896 12.6 69.5 7176 120 40.9 8.3
Syrian Arab Republic 0.7283951 0.1856946 12.3 69.6 2728 49 41.6 12.4
Congo 0.8446809 0.9383562 11.1 62.3 6012 410 126.7 11.5
Zambia 0.5863636 0.8539720 13.5 60.1 3734 280 125.4 12.7
Ghana 0.6986090 0.9425770 11.5 61.4 3852 380 58.4 10.9
Bangladesh 0.8256659 0.6825208 10.0 71.6 3191 170 80.6 20.0
Cambodia 0.4323144 0.9109827 10.9 68.4 2949 170 44.3 19.0
Kenya 0.8057325 0.8591160 11.0 61.6 2762 400 93.6 20.8
Nepal 0.4633508 0.9173364 12.4 69.6 2311 190 73.7 29.5
Pakistan 0.4186551 0.2967431 7.8 66.2 4866 170 27.3 19.7
Myanmar 1.4967320 0.9137303 8.6 65.9 4608 200 12.1 4.7
Swaziland 0.8423077 0.6131285 11.3 49.0 5542 310 72.0 14.7
Tanzania (United Republic of) 0.5894737 0.9767184 9.2 65.0 2411 410 122.7 36.0
Cameroon 0.6103152 0.8307292 10.4 55.5 2803 590 115.8 27.1
Zimbabwe 0.7854839 0.9275362 10.9 57.5 1615 470 60.3 35.1
Mauritania 0.3971292 0.3628319 8.5 63.1 3560 320 73.3 22.2
Papua New Guinea 0.5241379 0.9527027 9.9 62.6 2463 220 62.1 2.7
Yemen 0.3220974 0.3518006 9.2 63.8 3519 270 47.0 0.7
Lesotho 1.1526316 0.8027211 11.1 49.8 3306 490 89.4 26.8
Togo 0.3995037 0.9913899 12.2 59.7 1228 450 91.5 17.6
Haiti 0.6363636 0.8577465 8.7 62.8 1669 380 42.0 3.5
Rwanda 0.9090909 1.0128957 10.3 64.2 1458 320 33.6 57.5
Uganda 0.6835821 0.9570707 9.8 58.5 1613 360 126.6 35.0
Benin 0.4185185 0.8633461 11.1 59.6 1767 340 90.2 8.4
Sudan 0.6648352 0.4118421 7.0 63.5 3809 360 84.0 23.8
Senegal 0.4675325 0.7500000 7.9 66.5 2188 320 94.4 42.7
Afghanistan 0.1979866 0.1987421 9.3 60.4 1885 400 86.8 27.6
Côte d’Ivoire 0.4651163 0.6437346 8.9 51.5 3171 720 130.3 9.2
Malawi 0.5138889 1.0380368 10.8 62.8 747 510 144.8 16.7
Ethiopia 0.4285714 0.8756999 8.5 64.1 1428 420 78.4 25.5
Gambia 0.5523810 0.8709288 8.8 60.2 1507 430 115.8 9.4
Congo (Democratic Republic of the) 0.3950617 0.9658470 9.8 58.7 680 730 135.3 8.2
Liberia 0.3918575 0.8981481 9.5 60.9 805 640 117.4 10.7
Mali 0.5099338 0.6240786 8.4 58.0 1583 550 175.6 9.5
Mozambique 0.2258065 1.0326087 9.3 55.1 1123 480 137.8 39.6
Sierra Leone 0.4608295 0.9521739 8.6 50.9 1780 1100 100.7 12.4
Burkina Faso 0.2812500 0.8566667 7.8 58.7 1591 400 115.4 13.3
Burundi 0.6385542 1.0158537 10.1 56.7 758 740 30.3 34.9
Chad 0.1717172 0.8080808 7.4 51.6 2085 980 152.0 14.9
Central African Republic 0.3782772 0.8531140 7.2 50.7 581 880 98.3 12.5
Niger 0.3076923 0.4459309 5.4 61.4 908 630 204.8 13.3
# data summary
knitr::kable(summary(human)) %>% 
  kable_styling(bootstrap_options = "striped", position = "center", font_size = 11)
ratio.sec.edu ratio.lab.force edu.expect life.exp GNI mat.mor.r adol.birth rep.parliament
Min. :0.1717 Min. :0.1857 Min. : 5.40 Min. :49.00 Min. : 581 Min. : 1.0 Min. : 0.60 Min. : 0.00
1st Qu.:0.7264 1st Qu.:0.5984 1st Qu.:11.25 1st Qu.:66.30 1st Qu.: 4198 1st Qu.: 11.5 1st Qu.: 12.65 1st Qu.:12.40
Median :0.9375 Median :0.7535 Median :13.50 Median :74.20 Median : 12040 Median : 49.0 Median : 33.60 Median :19.30
Mean :0.8529 Mean :0.7074 Mean :13.18 Mean :71.65 Mean : 17628 Mean : 149.1 Mean : 47.16 Mean :20.91
3rd Qu.:0.9968 3rd Qu.:0.8535 3rd Qu.:15.20 3rd Qu.:77.25 3rd Qu.: 24512 3rd Qu.: 190.0 3rd Qu.: 71.95 3rd Qu.:27.95
Max. :1.4967 Max. :1.0380 Max. :20.20 Max. :83.50 Max. :123124 Max. :1100.0 Max. :204.80 Max. :57.50

1.2.1. Graphical overview of the human data set

# visualization of human data set
ov_human <- ggpairs(human, mapping = aes(), title ="Overview of the human data set", 
                     lower = list(combo = wrap("facethist", bins = 20)), 
                     upper = list(continuous = wrap("cor", size = 3)))
ov_human

The overview plot shows the distribution of every variable in the data set together with the pairwise correlations. Most of the variables are roughly normally distributed. GNI, maternal mortality rate and adolescent birth rate are right-skewed, i.e. most countries show low values on these variables. The upper panel displays the correlations of the variables as numeric values. Following you see the correlation matrix and a correlation plot presenting the values graphically.

# calculate the correlation matrix and round it
cor_human <- cor(human) %>% round(digits = 2)

cor_human %>% knitr::kable(caption = "Correlation table of human data set") %>% 
  kable_styling(bootstrap_options = "striped", full_width = FALSE, position = "center")
Correlation table of human data set
ratio.sec.edu ratio.lab.force edu.expect life.exp GNI mat.mor.r adol.birth rep.parliament
ratio.sec.edu 1.00 0.01 0.59 0.58 0.43 -0.66 -0.53 0.08
ratio.lab.force 0.01 1.00 0.05 -0.14 -0.02 0.24 0.12 0.25
edu.expect 0.59 0.05 1.00 0.79 0.62 -0.74 -0.70 0.21
life.exp 0.58 -0.14 0.79 1.00 0.63 -0.86 -0.73 0.17
GNI 0.43 -0.02 0.62 0.63 1.00 -0.50 -0.56 0.09
mat.mor.r -0.66 0.24 -0.74 -0.86 -0.50 1.00 0.76 -0.09
adol.birth -0.53 0.12 -0.70 -0.73 -0.56 0.76 1.00 -0.07
rep.parliament 0.08 0.25 0.21 0.17 0.09 -0.09 -0.07 1.00
# compute p-values for the correlations (used to mark insignificant ones);
# note: cor.mtest() expects the data, not the correlation matrix
p.mat <- cor.mtest(human)$p

# visualize the correlation matrix
# correlations / colour shows the correlation values
corrplot(cor_human, method="pie", type="lower",  tl.cex = 0.65, p.mat = p.mat, sig.level = 0.01, tl.srt = 45, title="Correlations of the human data set", mar=c(0,0,1,0))  

This correlation plot shows nicely which variables correlate positively or negatively with each other. Crossed-out cells mark correlations that are not significant (p-value > 0.01). Strong positive correlations appear between life expectancy and expected years of education, between gross national income and expected years of education / life expectancy, and between adolescent birth rate and maternal mortality. So better education goes along with higher income and higher life expectancy, and the maternal mortality rate is closely linked to the adolescent birth rate.
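As a small illustration of where such a significance mask comes from, base R's `cor.test()` returns the p-value behind each cell. This is a toy sketch with made-up vectors, not the human data:

```r
# Toy vectors: b is strongly related to a, so their correlation
# should come out as significant at the 0.01 level.
set.seed(7)
a <- rnorm(30)
b <- a + rnorm(30, sd = 0.5)    # a plus a little noise
cor.test(a, b)$p.value < 0.01   # TRUE: significant correlation
```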


1.3. Data Analysis

1.3.1. Principal component analysis (PCA) on the non-standardized human data set

Generally, a PCA transforms the data into new features called principal components. The first principal component (PC1) captures the maximum amount of variance of the features in the original data. The second PC captures the maximum amount of the remaining variability and is orthogonal to the first PC (at a right angle to it). All principal components are mutually uncorrelated. Here we use PC1 and PC2 for the data interpretation.
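These two properties can be checked directly with `prcomp()`. A quick sketch on random toy data (not the human data set):

```r
# Toy demonstration: the component scores returned by prcomp() are
# mutually uncorrelated, and the components are ordered by decreasing variance.
set.seed(1)
toy <- matrix(rnorm(200), ncol = 4)   # 50 observations, 4 variables
pca_toy <- prcomp(toy)

round(cor(pca_toy$x), digits = 10)    # identity matrix: PCs are uncorrelated
diff(pca_toy$sdev) <= 0               # all TRUE: variance decreases by component
```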

Below are the principal component analysis of the non-standardized human data set, a table with the loadings on principal components 1 & 2, and the PCA biplot.

# perform PCA on not standardized data set
pca_human <- prcomp(human)

# PCA results on PC1 and PC2 on all variables
round(pca_human$rotation[,1:2], digits = 4) %>% knitr::kable(caption = "PCA result on all variables") %>% kable_styling(bootstrap_options = "striped", full_width = FALSE, position = "center")
PCA result on all variables
PC1 PC2
ratio.sec.edu 0.0000 0.0007
ratio.lab.force 0.0000 -0.0003
edu.expect -0.0001 0.0076
life.exp -0.0003 0.0283
GNI -1.0000 -0.0058
mat.mor.r 0.0057 -0.9916
adol.birth 0.0012 -0.1256
rep.parliament -0.0001 0.0032
# summary of the pca
s1 <- summary(pca_human)

# round the percentages of variance captured by the pc
pca_prc1 <- round(100 * s1$importance[2, ], digits = 3) 

# Principal components 1 & 2 table
round(100 * s1$importance[2, 1:2], digits = 5) %>% knitr::kable(caption = "Principal components") %>% kable_styling(bootstrap_options = "striped", full_width = FALSE, position = "center")
Principal components
x
PC1 99.99
PC2 0.01
# prepare a label for the biplot
pca_label1 <-  paste0("GNI (", pca_prc1[1],"%)" )
pca_label2 <-  paste0("Maternal mortality (", pca_prc1[2],"%)" )

# draw the pca biplot
biplot(pca_human, choices = 1:2, cex = c(0.55, 0.9), col = c("grey40", "darkgreen"), xlab = pca_label1, 
                      ylab = pca_label2, main = "PCA biplot of the non-standardized human data set")

Let’s try another kind of biplot

# more advanced biplot
autoplot(pca_human, data = human, label = TRUE, label.size = 3.0, colour = "darkgreen", loadings = TRUE, loadings.label = TRUE, loadings.colour = "red") + ggtitle("PCA biplot of the non-standardized human data set") + xlab(pca_label1) + ylab(pca_label2) + theme_bw()

Principal component 1 captures 99.99% of the variance in the data set; PC2 captures just 0.01%. PC1 is associated almost entirely with the GNI variable: because the data is not standardized, Gross National Income per capita, the variable with by far the largest numeric scale, dominates the first principal component, while the maternal mortality rate accounts for the remaining 0.01% on PC2.


1.3.2. Principal component analysis (PCA) on the standardized human data set

First we scale the human data set so that it is standardized: the mean of every variable becomes 0 and the standard deviation becomes 1. See the summary table of the standardized data set below.
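As a side note, the standardization that `scale()` performs can be checked by hand on a toy vector (a quick sketch, not the course data):

```r
# scale() with its defaults subtracts the mean and divides by the
# standard deviation of each column.
x <- c(2, 4, 6, 8)
manual <- (x - mean(x)) / sd(x)
all.equal(as.numeric(scale(x)), manual)   # TRUE
```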

# scaling of human data set
human_stzd <- scale(human)

# summary of standardized human data set
knitr::kable(summary(human_stzd)) %>% 
  kable_styling(bootstrap_options = "striped", position = "center", font_size = 11)
ratio.sec.edu ratio.lab.force edu.expect life.exp GNI mat.mor.r adol.birth rep.parliament
Min. :-2.8189 Min. :-2.6247 Min. :-2.7378 Min. :-2.7188 Min. :-0.9193 Min. :-0.6992 Min. :-1.1325 Min. :-1.8203
1st Qu.:-0.5233 1st Qu.:-0.5484 1st Qu.:-0.6782 1st Qu.:-0.6425 1st Qu.:-0.7243 1st Qu.:-0.6496 1st Qu.:-0.8394 1st Qu.:-0.7409
Median : 0.3503 Median : 0.2316 Median : 0.1140 Median : 0.3056 Median :-0.3013 Median :-0.4726 Median :-0.3298 Median :-0.1403
Mean : 0.0000 Mean : 0.0000 Mean : 0.0000 Mean : 0.0000 Mean : 0.0000 Mean : 0.0000 Mean : 0.0000 Mean : 0.0000
3rd Qu.: 0.5958 3rd Qu.: 0.7350 3rd Qu.: 0.7126 3rd Qu.: 0.6717 3rd Qu.: 0.3712 3rd Qu.: 0.1932 3rd Qu.: 0.6030 3rd Qu.: 0.6127
Max. : 2.6646 Max. : 1.6632 Max. : 2.4730 Max. : 1.4218 Max. : 5.6890 Max. : 4.4899 Max. : 3.8344 Max. : 3.1850

In the summary table you can see that all mean values are 0. So the scaling of the human data set was successful.
Now the PCA of this standardized data follows.

# perform PCA on standardized data set
pca_human_stzd <- prcomp(human_stzd)

# Principal components 1 & 2 table
round(pca_human_stzd$rotation[, 1:2], digits = 4) %>% knitr::kable(caption = "PC1 & 2 of standardized human data") %>% kable_styling(bootstrap_options = "striped", full_width = FALSE, position = "center")
PC1 & 2 of standardized human data
PC1 PC2
ratio.sec.edu -0.3566 0.0380
ratio.lab.force 0.0546 0.7243
edu.expect -0.4277 0.1394
life.exp -0.4437 -0.0253
GNI -0.3505 0.0506
mat.mor.r 0.4370 0.1451
adol.birth 0.4113 0.0771
rep.parliament -0.0844 0.6514
# PC labels
s2 <- summary(pca_human_stzd)
pca_prc2 <- round(100 * s2$importance[2, ], digits = 1) 

pca_label2.1 <- paste0("Education and health (",pca_prc2[1],"%) ")
pca_label2.2 <- paste0("Female social participation (",pca_prc2[2],"%) ")

# Principal components 1 & 2 table
round(100 * s2$importance[2, 1:2], digits = 2) %>% knitr::kable(caption = "Principal components") %>% kable_styling(bootstrap_options = "striped", full_width = FALSE, position = "center")
Principal components
x
PC1 53.61
PC2 16.24
# draw the PCA biplot
biplot(pca_human_stzd, choices = 1:2, cex = c(0.5, 0.9), col = c("grey40", "deeppink2"), xlab = pca_label2.1, ylab = pca_label2.2, main = "PCA biplot of standardized human data set")

Also here another kind of biplot

# more advanced biplot
autoplot(pca_human_stzd, data = human, colour = "darkgreen", label = TRUE, label.size = 3.0, loadings = TRUE, loadings.label = TRUE, loadings.colour = "red") + ggtitle("PCA biplot of standardized human data set") + xlab(pca_label2.1) + ylab(pca_label2.2) + theme_bw()

The principal component representing 16.2% of the data variation (PC2) is strongly associated with the share of female parliament representatives and the labour force ratio (female/male). So the participation of women in the labour market and in political representation is what principal component 2 mostly captures. The variables most strongly associated with principal component 1 (representing 53.6% of the data variation) are life expectancy, expected years of education, the female/male ratio in secondary education, adolescent birth rate and maternal mortality. Overall, principal component 1 represents the health and education situation of the countries.

The PCA results differ a lot! Before performing a PCA we need to make sure that the observations are standardized, i.e. transformed onto comparable scales.
In the PCA of the non-standardized human data set, principal component 1 captures almost all of the variation, and the Gross National Income per capita alone dominates it.
The PCA of the standardized values looks quite different: the observations spread across the whole biplot and more variables contribute to the two principal components.
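Incidentally, `prcomp()` can do the standardization internally via its `scale.` argument, which matches scaling the data first. A sketch on random toy data (not the human data set):

```r
# prcomp(x, scale. = TRUE) and prcomp(scale(x)) produce the same loadings.
set.seed(42)
x <- matrix(rnorm(120), ncol = 3)
r1 <- prcomp(x, scale. = TRUE)$rotation
r2 <- prcomp(scale(x))$rotation
all.equal(abs(r1), abs(r2))   # TRUE (signs of loadings are arbitrary)
```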

The arrows represent the original features of the data set, and the angle between two arrows can be interpreted as the correlation between those features: a small angle indicates a strong positive correlation.
Likewise, the angle between a feature arrow and a principal component axis reflects the correlation between the two; again, a small angle means a positive correlation. The length of an arrow is proportional to the standard deviation of the feature.


2. Multiple Correspondence Analysis (MCA)

2.1. Tea consumption behaviour MCA

The “tea” data set contains data about tea consumption. Below you see the MCA of the tea data set as it was performed in the DataCamp exercise.

2.1.1. Load the data set & check it

# load the package and the tea data set
library(FactoMineR)
data(tea)

# check the tea data set
str(tea)
## 'data.frame':    300 obs. of  36 variables:
##  $ breakfast       : Factor w/ 2 levels "breakfast","Not.breakfast": 1 1 2 2 1 2 1 2 1 1 ...
##  $ tea.time        : Factor w/ 2 levels "Not.tea time",..: 1 1 2 1 1 1 2 2 2 1 ...
##  $ evening         : Factor w/ 2 levels "evening","Not.evening": 2 2 1 2 1 2 2 1 2 1 ...
##  $ lunch           : Factor w/ 2 levels "lunch","Not.lunch": 2 2 2 2 2 2 2 2 2 2 ...
##  $ dinner          : Factor w/ 2 levels "dinner","Not.dinner": 2 2 1 1 2 1 2 2 2 2 ...
##  $ always          : Factor w/ 2 levels "always","Not.always": 2 2 2 2 1 2 2 2 2 2 ...
##  $ home            : Factor w/ 2 levels "home","Not.home": 1 1 1 1 1 1 1 1 1 1 ...
##  $ work            : Factor w/ 2 levels "Not.work","work": 1 1 2 1 1 1 1 1 1 1 ...
##  $ tearoom         : Factor w/ 2 levels "Not.tearoom",..: 1 1 1 1 1 1 1 1 1 2 ...
##  $ friends         : Factor w/ 2 levels "friends","Not.friends": 2 2 1 2 2 2 1 2 2 2 ...
##  $ resto           : Factor w/ 2 levels "Not.resto","resto": 1 1 2 1 1 1 1 1 1 1 ...
##  $ pub             : Factor w/ 2 levels "Not.pub","pub": 1 1 1 1 1 1 1 1 1 1 ...
##  $ Tea             : Factor w/ 3 levels "black","Earl Grey",..: 1 1 2 2 2 2 2 1 2 1 ...
##  $ How             : Factor w/ 4 levels "alone","lemon",..: 1 3 1 1 1 1 1 3 3 1 ...
##  $ sugar           : Factor w/ 2 levels "No.sugar","sugar": 2 1 1 2 1 1 1 1 1 1 ...
##  $ how             : Factor w/ 3 levels "tea bag","tea bag+unpackaged",..: 1 1 1 1 1 1 1 1 2 2 ...
##  $ where           : Factor w/ 3 levels "chain store",..: 1 1 1 1 1 1 1 1 2 2 ...
##  $ price           : Factor w/ 6 levels "p_branded","p_cheap",..: 4 6 6 6 6 3 6 6 5 5 ...
##  $ age             : int  39 45 47 23 48 21 37 36 40 37 ...
##  $ sex             : Factor w/ 2 levels "F","M": 2 1 1 2 2 2 2 1 2 2 ...
##  $ SPC             : Factor w/ 7 levels "employee","middle",..: 2 2 4 6 1 6 5 2 5 5 ...
##  $ Sport           : Factor w/ 2 levels "Not.sportsman",..: 2 2 2 1 2 2 2 2 2 1 ...
##  $ age_Q           : Factor w/ 5 levels "15-24","25-34",..: 3 4 4 1 4 1 3 3 3 3 ...
##  $ frequency       : Factor w/ 4 levels "1/day","1 to 2/week",..: 1 1 3 1 3 1 4 2 3 3 ...
##  $ escape.exoticism: Factor w/ 2 levels "escape-exoticism",..: 2 1 2 1 1 2 2 2 2 2 ...
##  $ spirituality    : Factor w/ 2 levels "Not.spirituality",..: 1 1 1 2 2 1 1 1 1 1 ...
##  $ healthy         : Factor w/ 2 levels "healthy","Not.healthy": 1 1 1 1 2 1 1 1 2 1 ...
##  $ diuretic        : Factor w/ 2 levels "diuretic","Not.diuretic": 2 1 1 2 1 2 2 2 2 1 ...
##  $ friendliness    : Factor w/ 2 levels "friendliness",..: 2 2 1 2 1 2 2 1 2 1 ...
##  $ iron.absorption : Factor w/ 2 levels "iron absorption",..: 2 2 2 2 2 2 2 2 2 2 ...
##  $ feminine        : Factor w/ 2 levels "feminine","Not.feminine": 2 2 2 2 2 2 2 1 2 2 ...
##  $ sophisticated   : Factor w/ 2 levels "Not.sophisticated",..: 1 1 1 2 1 1 1 2 2 1 ...
##  $ slimming        : Factor w/ 2 levels "No.slimming",..: 1 1 1 1 1 1 1 1 1 1 ...
##  $ exciting        : Factor w/ 2 levels "exciting","No.exciting": 2 1 2 2 2 2 2 2 2 2 ...
##  $ relaxing        : Factor w/ 2 levels "No.relaxing",..: 1 1 2 2 2 2 2 2 2 2 ...
##  $ effect.on.health: Factor w/ 2 levels "effect on health",..: 2 2 2 2 2 2 2 2 2 2 ...

2.1.2. Data overview

# column names to keep in the dataset
keep_tea <- c("Tea", "How", "how", "sugar", "where", "lunch") # from DataCamp

# select the 'keep_columns' to create a new dataset
tea_time <- select(tea, one_of(keep_tea))

gather(tea_time) %>% ggplot(aes(value)) + facet_wrap("key", scales = "free") + geom_bar(fill = "darkgreen") + theme(axis.text.x = element_text(angle = 45, hjust = 1, size = 8))


2.1.3. MCA

# multiple correspondence analysis
mca_tea_time <- MCA(tea_time, graph = FALSE)

# summary of the model
summary(mca_tea_time)
## 
## Call:
## MCA(X = tea_time, graph = FALSE) 
## 
## 
## Eigenvalues
##                        Dim.1   Dim.2   Dim.3   Dim.4   Dim.5   Dim.6
## Variance               0.279   0.261   0.219   0.189   0.177   0.156
## % of var.             15.238  14.232  11.964  10.333   9.667   8.519
## Cumulative % of var.  15.238  29.471  41.435  51.768  61.434  69.953
##                        Dim.7   Dim.8   Dim.9  Dim.10  Dim.11
## Variance               0.144   0.141   0.117   0.087   0.062
## % of var.              7.841   7.705   6.392   4.724   3.385
## Cumulative % of var.  77.794  85.500  91.891  96.615 100.000
## 
## Individuals (the 10 first)
##                       Dim.1    ctr   cos2    Dim.2    ctr   cos2    Dim.3
## 1                  | -0.298  0.106  0.086 | -0.328  0.137  0.105 | -0.327
## 2                  | -0.237  0.067  0.036 | -0.136  0.024  0.012 | -0.695
## 3                  | -0.369  0.162  0.231 | -0.300  0.115  0.153 | -0.202
## 4                  | -0.530  0.335  0.460 | -0.318  0.129  0.166 |  0.211
## 5                  | -0.369  0.162  0.231 | -0.300  0.115  0.153 | -0.202
## 6                  | -0.369  0.162  0.231 | -0.300  0.115  0.153 | -0.202
## 7                  | -0.369  0.162  0.231 | -0.300  0.115  0.153 | -0.202
## 8                  | -0.237  0.067  0.036 | -0.136  0.024  0.012 | -0.695
## 9                  |  0.143  0.024  0.012 |  0.871  0.969  0.435 | -0.067
## 10                 |  0.476  0.271  0.140 |  0.687  0.604  0.291 | -0.650
##                       ctr   cos2  
## 1                   0.163  0.104 |
## 2                   0.735  0.314 |
## 3                   0.062  0.069 |
## 4                   0.068  0.073 |
## 5                   0.062  0.069 |
## 6                   0.062  0.069 |
## 7                   0.062  0.069 |
## 8                   0.735  0.314 |
## 9                   0.007  0.003 |
## 10                  0.643  0.261 |
## 
## Categories (the 10 first)
##                        Dim.1     ctr    cos2  v.test     Dim.2     ctr
## black              |   0.473   3.288   0.073   4.677 |   0.094   0.139
## Earl Grey          |  -0.264   2.680   0.126  -6.137 |   0.123   0.626
## green              |   0.486   1.547   0.029   2.952 |  -0.933   6.111
## alone              |  -0.018   0.012   0.001  -0.418 |  -0.262   2.841
## lemon              |   0.669   2.938   0.055   4.068 |   0.531   1.979
## milk               |  -0.337   1.420   0.030  -3.002 |   0.272   0.990
## other              |   0.288   0.148   0.003   0.876 |   1.820   6.347
## tea bag            |  -0.608  12.499   0.483 -12.023 |  -0.351   4.459
## tea bag+unpackaged |   0.350   2.289   0.056   4.088 |   1.024  20.968
## unpackaged         |   1.958  27.432   0.523  12.499 |  -1.015   7.898
##                       cos2  v.test     Dim.3     ctr    cos2  v.test  
## black                0.003   0.929 |  -1.081  21.888   0.382 -10.692 |
## Earl Grey            0.027   2.867 |   0.433   9.160   0.338  10.053 |
## green                0.107  -5.669 |  -0.108   0.098   0.001  -0.659 |
## alone                0.127  -6.164 |  -0.113   0.627   0.024  -2.655 |
## lemon                0.035   3.226 |   1.329  14.771   0.218   8.081 |
## milk                 0.020   2.422 |   0.013   0.003   0.000   0.116 |
## other                0.102   5.534 |  -2.524  14.526   0.197  -7.676 |
## tea bag              0.161  -6.941 |  -0.065   0.183   0.006  -1.287 |
## tea bag+unpackaged   0.478  11.956 |   0.019   0.009   0.000   0.226 |
## unpackaged           0.141  -6.482 |   0.257   0.602   0.009   1.640 |
## 
## Categorical variables (eta2)
##                      Dim.1 Dim.2 Dim.3  
## Tea                | 0.126 0.108 0.410 |
## How                | 0.076 0.190 0.394 |
## how                | 0.708 0.522 0.010 |
## sugar              | 0.065 0.001 0.336 |
## where              | 0.702 0.681 0.055 |
## lunch              | 0.000 0.064 0.111 |
# visualize MCA
plot(mca_tea_time, invisible=c("ind"), habillage = "quali")


2.1.4. MCA, different biplot using “factoextra” package

Here the MCA is presented in another biplot, produced with the function ‘fviz_mca_biplot()’ from the ‘factoextra’ package. As the colour of the individual points I added the squared cosine (cos2), a value between 0 and 1: the closer it is to 1, the better the individual is projected onto the dimensions.

#draw a MCA biplot using factoextra function "fviz_mca_biplot"

fviz_mca_biplot(mca_tea_time, col.ind = "cos2", col.var = "red", label = "var", geom =  c("point","text"), labelsize = 4, arrows = c(FALSE,TRUE)) + labs(title = "MCA of tea consumption behaviour") + theme_grey() + theme(axis.line = element_line(size = 0.5), panel.background = element_rect(fill = "gray93"))


2.2. Another MCA analysis of different “tea” variables (reasons for drinking tea)

I want to know which reasons people have for drinking tea, so I have chosen the following six variables for the MCA:

  • sex –> Gender of the tea consumer
  • age_Q –> Age class of the tea consumer
  • spirituality –> Spiritual reason to drink tea
  • healthy –> Health reason to drink tea
  • slimming –> Whether the tea consumer wants to lose weight
  • relaxing –> Whether tea has a relaxing effect on the consumer

I prepare a separate tea_reason data set and perform the MCA on that subset.

2.2.1. Prepare the data set and overview graph

# column names to keep for my analysis
keep_reason <- c("sex", "age_Q", "spirituality", "healthy", "slimming", "relaxing")

# select the 'keep_columns' to create a new dataset
tea_reason <- select(tea, one_of(keep_reason))

gather(tea_reason) %>% ggplot(aes(value)) + facet_wrap("key", scales = "free") + geom_bar(fill = "darkviolet", alpha = 0.6) + theme(axis.text.x = element_text(angle = 45, hjust = 1, size = 8))

2.2.2. MCA of the “tea_reason” data

# multiple correspondence analysis
mca_tea_reason <- MCA(tea_reason, graph = FALSE)

# summary of the model
summary(mca_tea_reason)
## 
## Call:
## MCA(X = tea_reason, graph = FALSE) 
## 
## 
## Eigenvalues
##                        Dim.1   Dim.2   Dim.3   Dim.4   Dim.5   Dim.6
## Variance               0.233   0.217   0.202   0.173   0.166   0.138
## % of var.             15.562  14.476  13.456  11.502  11.066   9.187
## Cumulative % of var.  15.562  30.039  43.495  54.997  66.063  75.251
##                        Dim.7   Dim.8   Dim.9
## Variance               0.132   0.126   0.113
## % of var.              8.833   8.393   7.523
## Cumulative % of var.  84.083  92.477 100.000
## 
## Individuals (the 10 first)
##                     Dim.1    ctr   cos2    Dim.2    ctr   cos2    Dim.3
## 1                |  0.491  0.344  0.135 |  0.199  0.061  0.022 |  0.110
## 2                |  0.506  0.366  0.210 |  0.599  0.551  0.294 | -0.564
## 3                |  0.039  0.002  0.001 |  0.353  0.191  0.119 | -0.380
## 4                | -0.602  0.518  0.306 | -0.531  0.432  0.237 |  0.178
## 5                |  0.208  0.062  0.024 | -0.446  0.305  0.112 | -0.116
## 6                | -0.270  0.104  0.081 | -0.288  0.128  0.093 |  0.198
## 7                |  0.024  0.001  0.000 | -0.047  0.003  0.001 |  0.294
## 8                | -0.193  0.053  0.025 |  0.146  0.033  0.014 | -0.211
## 9                |  0.308  0.135  0.049 | -0.410  0.259  0.088 |  0.073
## 10               |  0.024  0.001  0.000 | -0.047  0.003  0.001 |  0.294
##                     ctr   cos2  
## 1                 0.020  0.007 |
## 2                 0.526  0.261 |
## 3                 0.239  0.138 |
## 4                 0.052  0.027 |
## 5                 0.022  0.008 |
## 6                 0.065  0.044 |
## 7                 0.143  0.054 |
## 8                 0.074  0.030 |
## 9                 0.009  0.003 |
## 10                0.143  0.054 |
## 
## Categories (the 10 first)
##                      Dim.1     ctr    cos2  v.test     Dim.2     ctr
## F                |  -0.256   2.767   0.095  -5.338 |   0.220   2.198
## M                |   0.373   4.037   0.095   5.338 |  -0.321   3.208
## 15-24            |  -0.834  15.237   0.308  -9.593 |  -0.524   6.472
## 25-34            |   0.681   7.619   0.139   6.437 |  -0.744   9.760
## 35-44            |   0.018   0.003   0.000   0.121 |   0.149   0.229
## 45-59            |   0.692   6.948   0.122   6.043 |   0.728   8.276
## +60              |  -0.346   1.086   0.017  -2.282 |   1.293  16.262
## Not.spirituality |   0.302   4.458   0.199   7.719 |   0.212   2.374
## spirituality     |  -0.661   9.770   0.199  -7.719 |  -0.465   5.203
## healthy          |  -0.247   3.050   0.142  -6.525 |   0.304   4.981
##                     cos2  v.test     Dim.3     ctr    cos2  v.test  
## F                  0.070   4.589 |  -0.555  15.066   0.449 -11.582 |
## M                  0.070  -4.589 |   0.809  21.981   0.449  11.582 |
## 15-24              0.122  -6.030 |  -0.522   6.893   0.120  -6.000 |
## 25-34              0.165  -7.027 |   0.983  18.367   0.289   9.294 |
## 35-44              0.003   1.014 |  -0.261   0.750   0.010  -1.770 |
## 45-59              0.135   6.361 |  -0.716   8.605   0.131  -6.254 |
## +60                0.243   8.517 |   0.901   8.498   0.118   5.936 |
## Not.spirituality   0.099   5.433 |   0.017   0.016   0.001   0.432 |
## spirituality       0.099  -5.433 |  -0.037   0.035   0.001  -0.432 |
## healthy            0.216   8.043 |   0.179   1.853   0.075   4.730 |
## 
## Categorical variables (eta2)
##                    Dim.1 Dim.2 Dim.3  
## sex              | 0.095 0.070 0.449 |
## age_Q            | 0.433 0.534 0.522 |
## spirituality     | 0.199 0.099 0.001 |
## healthy          | 0.142 0.216 0.075 |
## slimming         | 0.100 0.272 0.107 |
## relaxing         | 0.431 0.111 0.058 |
# draw the MCA biplot
fviz_mca_biplot(mca_tea_reason, col.ind = "cos2", col.var = "red", alpha.var = "contrib", label = "var", geom =  c("point","text"), labelsize = 3.5, arrows = c(FALSE,TRUE)) + labs(title = "MCA of tea consumption reasons") + theme_grey() + theme(axis.line = element_line(size = 0.5), panel.background = element_rect(fill = "gray93"))

The biplot shows interesting results. Variable categories that lie close together are more similar. The plot uses two colour scales: one for the cos2 of the individuals (the closer to 1, the better the individual is projected onto the dimensions) and one for the contribution of the variables to the dimensions (“contrib”). The biplot gives the following information:

  • Men aged 25-34 do not drink tea for health or weight-loss reasons
  • People aged 45-59 are similar in having neither relaxation nor spirituality as a reason for drinking tea
  • For people aged 15-24, the spirituality and relaxation reasons lie close by and are thereby similar
  • Women and the 60+ age class show similarities with drinking tea for health and weight-loss reasons


Chapter 6: Analysis of longitudinal data

Data wrangling and performing an analysis of longitudinal data

Work of week 49 (02.12. - 08.12.2019)


The data wrangling of the two data sets (BPRS & RATS) was performed beforehand. The script of the data wrangling exercise can be found here.

1. Load and check the data sets

1.1. Load the wide (bprs & rats) and long (bprs_l & rats_l) data sets

# load necessary packages
library(tidyr)
library(dplyr)
library(corrplot)
library(ggplot2)
library(ggthemes)
library(GGally)
library(knitr)
library(kableExtra)
library(stringr)
library(lme4)
# load the wide data sets
bprs <- read.table("https://raw.githubusercontent.com/KimmoVehkalahti/MABS/master/Examples/data/BPRS.txt", sep = " ", header = TRUE)

rats <- read.table("https://raw.githubusercontent.com/KimmoVehkalahti/MABS/master/Examples/data/rats.txt", header = TRUE)

# load the long data sets
bprs_l <- read.table("C:/Users/richla/OneDrive/1 C - R-Folder/11-IODS-course/IODS-project/data/bprs_l.txt")
rats_l <- read.table("C:/Users/richla/OneDrive/1 C - R-Folder/11-IODS-course/IODS-project/data/rats_l.txt")

# check the data sets
str(bprs_l) 
## 'data.frame':    360 obs. of  5 variables:
##  $ treatment: int  1 1 1 1 1 1 1 1 1 1 ...
##  $ subject  : int  1 2 3 4 5 6 7 8 9 10 ...
##  $ weeks    : Factor w/ 9 levels "week0","week1",..: 1 1 1 1 1 1 1 1 1 1 ...
##  $ bprs     : int  42 58 54 55 72 48 71 30 41 57 ...
##  $ week     : int  0 0 0 0 0 0 0 0 0 0 ...
str(rats_l)
## 'data.frame':    176 obs. of  5 variables:
##  $ ID    : int  1 2 3 4 5 6 7 8 9 10 ...
##  $ Group : int  1 1 1 1 1 1 1 1 2 2 ...
##  $ WD    : Factor w/ 11 levels "WD1","WD15","WD22",..: 1 1 1 1 1 1 1 1 1 1 ...
##  $ Weight: int  240 225 245 260 255 260 275 245 410 405 ...
##  $ Time  : int  1 1 1 1 1 1 1 1 1 1 ...

The structure output shows that the categorical variables “treatment” and “subject” in the BPRS data set and the categorical variables “Group” and “ID” in the RATS data set were read as integers, so I converted them back to factors. See the code below.

# bprs data --> factor treatment & subject
bprs_l$treatment <- factor(bprs_l$treatment)
bprs_l$subject <- factor(bprs_l$subject)

# rats data --> factor ID & Group
rats_l$ID <- factor(rats_l$ID)
rats_l$Group <- factor(rats_l$Group)

# check the data again
str(bprs_l)
## 'data.frame':    360 obs. of  5 variables:
##  $ treatment: Factor w/ 2 levels "1","2": 1 1 1 1 1 1 1 1 1 1 ...
##  $ subject  : Factor w/ 20 levels "1","2","3","4",..: 1 2 3 4 5 6 7 8 9 10 ...
##  $ weeks    : Factor w/ 9 levels "week0","week1",..: 1 1 1 1 1 1 1 1 1 1 ...
##  $ bprs     : int  42 58 54 55 72 48 71 30 41 57 ...
##  $ week     : int  0 0 0 0 0 0 0 0 0 0 ...
str(rats_l)
## 'data.frame':    176 obs. of  5 variables:
##  $ ID    : Factor w/ 16 levels "1","2","3","4",..: 1 2 3 4 5 6 7 8 9 10 ...
##  $ Group : Factor w/ 3 levels "1","2","3": 1 1 1 1 1 1 1 1 2 2 ...
##  $ WD    : Factor w/ 11 levels "WD1","WD15","WD22",..: 1 1 1 1 1 1 1 1 1 1 ...
##  $ Weight: int  240 225 245 260 255 260 275 245 410 405 ...
##  $ Time  : int  1 1 1 1 1 1 1 1 1 1 ...

1.2. The long data sets (bprs_l & rats_l)

The BPRS data set:
The BPRS (brief psychiatric rating scale) was measured on 40 male subjects divided into 2 treatment groups. One BPRS measurement was taken before treatment started (week0), followed by weekly measurements for 8 weeks in total. According to Vehkalahti & Everitt (2019), the BPRS assesses the level of 18 symptom constructs used to evaluate whether patients have schizophrenia. Each symptom construct is rated from 1 (not present) to 7 (extremely severe). The “long” BPRS data set bprs_l consists of 5 variables with 360 observations. These variables include:

  • treatment - the psychological treatment given to the male subjects (treatment 1 & 2)
  • subject - the male individual’s identification number
  • weeks - the factor variable showing the week of treatment and BPRS evaluation
  • bprs - the observed BPRS values
  • week - the week number of the BPRS evaluation (as an integer)
# bprs_l data set
knitr::kable(bprs_l, caption = "BPRS - long data set") %>% 
  kable_styling(bootstrap_options = "striped", full_width = FALSE, position = "center") %>% 
  scroll_box(height = "300px")
BPRS - long data set (scrollable table in the rendered report; first rows shown, 360 rows in total)
treatment subject weeks bprs week
1 1 week0 42 0
1 2 week0 58 0
1 3 week0 54 0
1 4 week0 55 0
1 5 week0 72 0
1 6 week0 48 0
…

The RATS data set:
This data consists of measurements of rat growth. Three groups of rats were fed different diets, and the body weight of each animal was measured over a period of 9 weeks. The research question is whether the growth profiles of the diet groups differ (Vehkalahti & Everitt, 2019). The “long” RATS data set rats_l consists of 5 variables with 176 observations. The variables include:

  • ID - the individual rat’s identification number
  • Group - the diet group (1, 2 & 3)
  • WD - the factor version of the measurement time (the day when the rats’ weights were measured)
  • Weight - the weight of each individual rat on the given day
  • Time - the time point of the weight measurement (days)
# rats_l data set
knitr::kable(rats_l, caption = "RATS - long data set") %>% 
  kable_styling(bootstrap_options = "striped", full_width = FALSE, position = "center") %>% 
  scroll_box(height = "300px")
RATS - long data set (scrollable table in the rendered report; first rows shown, 176 rows in total)
ID Group WD Weight Time
1 1 WD1 240 1
2 1 WD1 225 1
3 1 WD1 245 1
4 1 WD1 260 1
5 1 WD1 255 1
6 1 WD1 260 1
…

2. Analysis of the long data sets (bprs_l & rats_l)

The data set analyses presented in the MABS book and on the DataCamp platform are swapped. So we run the analyses of MABS book chapter 8 on the “rats_l” data set and the analyses of MABS book chapter 9 on the “bprs_l” data set.

3. Analyses of chapter 8 of MABS using the RATS data (rats_l)

The long rats_l data set will be analysed following the instructions of chapter 8 of MABS. As a first step, we get a graphical overview of the data.

3.1. A first plot

# plot 1
ggplot(rats_l, aes(x = Time, y = Weight,linetype = ID)) +
  geom_line(color = "darkgreen") +
  scale_linetype_manual(values = rep(1:6, times= 3)) + # there are 16 IDs (6 linetypes are available - so repeated 3 times)
  facet_grid(. ~ Group, labeller = label_both) +
  theme(legend.position = "right") +
  scale_y_continuous(limits = c(min(rats_l$Weight), max(rats_l$Weight))) + 
  ggtitle("RATS: long data set ")

Here the rats_l data is plotted. The x-axis (“Time”) represents the days of the weight measurements, and the y-axis shows the weights of the individual rats at the different measurement times. The plot is split into the three diet groups. The graph shows that the starting weights of the rats in groups 2 & 3 were higher than in group 1, and that the weights in groups 2 & 3 increased at a considerably higher rate, while in group 1 the weights increased much more slowly.
The data also shows the “tracking” phenomenon (acc. to the MABS book, Vehkalahti & Everitt, 2019), where rats with a higher starting weight tend to gain more weight over time. To make this tracking more visible, we standardise the rats_l data set.


3.2. Standardized data

The rats_l data set is standardised by calculating “StdWeight” with the following formula, applied within each time point: \(StdWeight = (Weight - mean(Weight)) / sd(Weight)\)
The calculated StdWeight is added as a new variable to the rats_l data set. Then the StdWeight is plotted.

# Standardise the variable Weight
rats_l <- rats_l %>%
  group_by(Time) %>%
  mutate(StdWeight = ((Weight - mean(Weight))/sd(Weight))) %>%
  ungroup()

# Glimpse the data
glimpse(rats_l)
## Observations: 176
## Variables: 6
## $ ID        <fct> 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 1...
## $ Group     <fct> 1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 2, 2, 3, 3, 3, 3, 1, 1...
## $ WD        <fct> WD1, WD1, WD1, WD1, WD1, WD1, WD1, WD1, WD1, WD1, WD...
## $ Weight    <int> 240, 225, 245, 260, 255, 260, 275, 245, 410, 405, 44...
## $ Time      <int> 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 8, 8...
## $ StdWeight <dbl> -1.0011429, -1.1203857, -0.9613953, -0.8421525, -0.8...
# Plot again with the standardised rats_l
ggplot(rats_l, aes(x = Time, y = StdWeight, linetype = ID)) +
  geom_line(color = "darkred") +
  scale_linetype_manual(values = rep(1:6, times=3)) +
  facet_grid(. ~ Group, labeller = label_both) +
  scale_y_continuous(name = "StdWeight - standardized Weight values") +
  ggtitle("RATS: standardized Weight values") + 
  theme(panel.grid.major.y = element_line(colour = "grey40"))

The “StdWeight” plot shows the standardized weight values over the measurement period. Here the “tracking” (higher starting weights tend to increase more than lower ones and stay at a higher level throughout the measurement period) can be seen much more clearly.


3.3. Summary graphs

First, the rats’ weights over the whole measurement period are shown as boxplots, with the diet groups in different colors.

# add the Time as a categorical variable "Time1"
rats_l <- rats_l %>%
  mutate(Time1 = factor(Time)) # refer to the column directly inside mutate()

# check the data 
str(rats_l)
## Classes 'tbl_df', 'tbl' and 'data.frame':    176 obs. of  7 variables:
##  $ ID       : Factor w/ 16 levels "1","2","3","4",..: 1 2 3 4 5 6 7 8 9 10 ...
##  $ Group    : Factor w/ 3 levels "1","2","3": 1 1 1 1 1 1 1 1 2 2 ...
##  $ WD       : Factor w/ 11 levels "WD1","WD15","WD22",..: 1 1 1 1 1 1 1 1 1 1 ...
##  $ Weight   : int  240 225 245 260 255 260 275 245 410 405 ...
##  $ Time     : int  1 1 1 1 1 1 1 1 1 1 ...
##  $ StdWeight: num  -1.001 -1.12 -0.961 -0.842 -0.882 ...
##  $ Time1    : Factor w/ 11 levels "1","8","15","22",..: 1 1 1 1 1 1 1 1 1 1 ...
# prepare a boxplot of the "rats_l" data set
ggplot(rats_l, aes(x = Time1, y = Weight, fill = Group)) + # use bare column names inside aes()
  geom_boxplot() +
  ylab("Weight") + 
  xlab("Weight measurement time points [days]") +
  ggtitle("Rat weights over the measurement period") +
  scale_fill_discrete(name = "Diet group") +
  theme(legend.position = "right") 

Now, the number of measurement time points n_rats is computed (11 weighings in total, so n = 11). Note that, strictly speaking, the standard error of a group mean should use the number of rats per group; here we follow the course convention and divide by the number of time points. Then a new table rats_s is prepared, containing the mean weight of the rats and the standard error of each diet group at each time point. The standard error is calculated with the following formula: \(SE = sd(Weight) / \sqrt n\) This new data table is then presented in the next plot.

# Number of measurement time points, baseline (Time = 1) included
n_rats <- rats_l$Time %>% unique() %>% length()

# Summary data with mean and standard error of rats_l by group and time 
rats_s <- rats_l %>%
  group_by(Group, Time) %>%
  summarise(mean = mean(Weight), se = (sd(Weight)/sqrt(n_rats))) %>%
  ungroup()

# Glimpse the data
str(rats_s)
## Classes 'tbl_df', 'tbl' and 'data.frame':    33 obs. of  4 variables:
##  $ Group: Factor w/ 3 levels "1","2","3": 1 1 1 1 1 1 1 1 1 1 ...
##  $ Time : int  1 8 15 22 29 36 43 44 50 57 ...
##  $ mean : num  251 255 254 262 265 ...
##  $ se   : num  4.59 3.95 3.46 4.1 3.33 ...
# Plot the mean profiles
ggplot(rats_s, aes(x = Time, y = mean, linetype = Group,  color = Group, shape = Group)) +
  geom_line(size = 0.6) +
  scale_linetype_manual(name = "Diet group", values = c(1,2,3)) +
  geom_point(size=1.5) +
  scale_shape_manual(name = "Diet group", values = c(16,17,18)) +
  scale_color_manual(name = "Diet group", values = c("red", "darkgreen", "blue")) +
  geom_errorbar(aes(ymin=mean-se, ymax=mean+se, linetype="1"), width=0.8) +
  theme(legend.position = "right") +
  scale_y_continuous(name = "Mean (Weight) +/- SE (Weight)") + 
  ggtitle("Mean weight profiles of the different diet groups")

This graph shows the weight development of the three diet groups over the measurement period. Group 1 starts at a lower level and its weight increases only slightly (~250 to 280). Groups 2 and 3 start from higher values and their weights also increase by larger amounts (in group 2 an increase of more than 50, in group 3 around 50). The standard errors in groups 2 & 3 are much larger, so there is a much bigger weight difference between the individual rats within those diet groups.
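As a side note, the standard error of a group mean is more commonly computed with the number of subjects in the group rather than the number of time points. A minimal, self-contained sketch in base R (the weights below are made-up illustrative values, not the RATS data):

```r
# standard error of a group mean using the group's own sample size n
se <- function(x) sd(x) / sqrt(length(x))

weights_g1 <- c(240, 225, 245, 260)   # hypothetical group 1 weights
weights_g2 <- c(410, 405, 445, 555)   # hypothetical group 2 weights

c(se_g1 = se(weights_g1), se_g2 = se(weights_g2))
```

Dividing by the number of time points instead (as in the course formula) gives a smaller, more optimistic error bar whenever there are more time points than subjects.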


3.4. Find the outlier

Let us now look at how the rats’ weights developed after the first measurement. To that end we prepare a new data table rats_l10s, where the weight data without Time = 1 is summarised by calculating the mean weight of each rat within its diet group.

# Create a summary data by Group and ID with mean as the summary variable (ignoring baseline Time 1).
rats_l10s <- rats_l %>%
  filter(Time > 1) %>%
  group_by(Group, ID) %>%
  summarise(mean=mean(Weight)) %>%
  ungroup()

# Glimpse the data
glimpse(rats_l10s)
## Observations: 16
## Variables: 3
## $ Group <fct> 1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 2, 2, 3, 3, 3, 3
## $ ID    <fct> 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16
## $ mean  <dbl> 263.2, 238.9, 261.7, 267.2, 270.9, 276.2, 274.6, 267.5, ...

From this new data table we prepare a boxplot:

# Draw a boxplot of the mean versus group
ggplot(rats_l10s, aes(x = Group, y = mean, fill = Group)) +
  geom_boxplot() +
  stat_summary(fun.y = "mean", geom = "point", shape = 23, size = 3, fill = "black") +
  scale_y_continuous(name = "Mean (Weight) / Time 8-64 [days]") + 
  xlab("Diet group") + 
  scale_fill_discrete(name = "Diet group") +
  ggtitle("Mean weights of diet groups excluding the first measurement")

The boxplot shows the mean weight of each rat in each diet group over the measurements after day 1. All three diet groups show an outlier. These outliers can distort further comparisons of the diet groups, so I remove them, one per group, as seen in the code below: I filter out the outliers and create a new data table rats_l10s1 containing the mean values without the outliers. Then a new boxplot of the diet groups is prepared.

# Create a new data by filtering the outliers and adjust the ggplot code the draw the plot again with the new data
rats_l10s1 <- rats_l10s %>%
    filter(
      (mean > 250 & Group == 1) | 
      (mean < 550 & Group == 2) |
      (mean > 500 & Group == 3))  %>% 
      ungroup()

str(rats_l10s1)
## Classes 'tbl_df', 'tbl' and 'data.frame':    13 obs. of  3 variables:
##  $ Group: Factor w/ 3 levels "1","2","3": 1 1 1 1 1 1 1 2 2 2 ...
##  $ ID   : Factor w/ 16 levels "1","2","3","4",..: 1 3 4 5 6 7 8 9 10 11 ...
##  $ mean : num  263 262 267 271 276 ...
# Draw a boxplot of the mean versus group with the outlier filtered data
ggplot(rats_l10s1, aes(x = Group, y = mean, fill = Group)) +
  geom_boxplot() +
  stat_summary(fun.y = "mean", geom = "point", shape = 23, size = 3, fill = "black") +
  scale_y_continuous(name = "Mean (Weight) / Time 8-64 [days]") +
  xlab("Diet group") + 
  scale_fill_discrete(name = "Diet group") +
  ggtitle("Mean weights of diet groups excluding the first measurement")

Here we see the boxplots of the mean weights over the measurement period of days 8 to 64, without the outlier values. With this data we continue analysing the differences between the diet groups.


3.5. T-test and ANOVA

We cannot perform a two-sample t-test between the diet groups, since a t-test compares exactly two groups and we have three.

# a two-sample t-test requires exactly two groups; with all three diet
# groups the call below fails ("grouping factor must have exactly 2 levels")
# t.test(mean ~ Group, data = rats_l10s1)
# so we continue with ANOVA instead
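To illustrate what a valid two-sample comparison looks like, here is a self-contained sketch of t.test() on toy group means (the values are illustrative, not the RATS measurements):

```r
# a two-sample t-test needs exactly two factor levels;
# toy mean weights for two hypothetical diet groups
toy <- data.frame(
  Group = factor(rep(c(1, 3), each = 6)),
  mean  = c(263, 239, 262, 267, 271, 276, 495, 536, 542, 536, 520, 510)
)
t.test(mean ~ Group, data = toy)  # compares the two group means
```

With three or more groups, ANOVA plus a post-hoc test replaces pairwise t-tests and controls the overall error rate.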

Now we continue by creating a new data table rats_l10s2, where a “baseline” weight variable is added from the first weight measurement on day 1 (WD1). So a baseline weight is defined for each rat. See the table below with the mean weight values of the whole measurement period and the starting weights.

# Add the baseline (day-1 weight, WD1) from the original wide data as a new variable
rats_l10s2 <- rats_l10s %>%
  mutate(baseline = rats$WD1) # the row order of rats matches the Group/ID order of rats_l10s

# check the data table
rats_l10s2 %>% knitr::kable() %>% kable_styling(bootstrap_options = "striped", full_width = FALSE, position = "center") %>% scroll_box(height = "300px")
Group ID mean baseline
1 1 263.2 240
1 2 238.9 225
1 3 261.7 245
1 4 267.2 260
1 5 270.9 255
1 6 276.2 260
1 7 274.6 275
1 8 267.5 245
2 9 443.9 410
2 10 457.5 405
2 11 455.8 445
2 12 594.0 555
3 13 495.2 470
3 14 536.4 535
3 15 542.2 520
3 16 536.2 510

Using this new data table with the mean and baseline weight values, we test whether there are differences between the diet groups and which diet group had the highest weight increase over the measurement period. In the code below I first fit a linear model comparing the group means to each other (from rats_l10s2, excluding the first weight measurement), and in a second linear model I add the baseline weight of each rat as a covariate.

I then compare the diet groups with an ANOVA and a post-hoc test (TukeyHSD) on the group means, since I want to know whether the diet groups differ from each other. The ANOVA is performed on rats_l10s2, i.e. the data without the first measurement but including the outlier values.

# linear regression model of mean values versus diet groups
rats_fit1 <- lm(mean ~ Group, data = rats_l10s2)
# regression model output
round(rats_fit1$coefficients, digits = 2) %>% knitr::kable() %>% kable_styling(bootstrap_options = "striped", full_width = FALSE, position = "center")
x
(Intercept) 265.02
Group2 222.77
Group3 262.48

The model shows that the mean weight of group 2 is about 223 g higher and the mean weight of group 3 about 262 g higher than in group 1. Let’s now check whether the diet groups differ from each other: with a post-hoc test after the ANOVA we can compare the group means.

# Compute the analysis of variance table of rats_fit1 <- lm(mean ~ Group, data = rats_l10s2) with anova()
rats_anova1 <- anova(rats_fit1)
rats_anova1
## Analysis of Variance Table
## 
## Response: mean
##           Df Sum Sq Mean Sq F value    Pr(>F)    
## Group      2 238620  119310  88.525 2.679e-08 ***
## Residuals 13  17521    1348                      
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

The ANOVA of the linear model mean weight ~ diet group shows that the group means differ significantly (F = 88.5, p < 0.001), but it does not tell us which groups differ; for that we need a post-hoc test.

# another kind of anova calculation --> compute the anova with aov()
rats_aov2 <- aov(mean ~ Group, data = rats_l10s2)
summary(rats_aov2)
##             Df Sum Sq Mean Sq F value   Pr(>F)    
## Group        2 238620  119310   88.53 2.68e-08 ***
## Residuals   13  17521    1348                     
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
# compare the anova results with tukeyHSD
TukeyHSD(rats_aov2, "Group", ordered = TRUE, conf.level = 0.95)
##   Tukey multiple comparisons of means
##     95% family-wise confidence level
##     factor levels have been ordered
## 
## Fit: aov(formula = mean ~ Group, data = rats_l10s2)
## 
## $Group
##        diff       lwr      upr     p adj
## 2-1 222.775 163.41452 282.1355 0.0000006
## 3-1 262.475 203.11452 321.8355 0.0000001
## 3-2  39.700 -28.84358 108.2436 0.3099506

Here we can see that the second ANOVA function gives the same result, and this result can be passed to the TukeyHSD function. The post-hoc test then gives us the pairwise differences between the diet groups: groups 1 & 2 differ significantly in mean weight, as do groups 1 & 3, while groups 2 & 3 do not differ significantly (partly due to the outlier influence).
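The TukeyHSD result can also be inspected programmatically. A self-contained sketch on toy data (randomly generated illustrative values, not the RATS means):

```r
# extract the adjusted p-values of the pairwise comparisons;
# toy group means roughly mimicking three diet groups
set.seed(42)
toy <- data.frame(
  Group = factor(rep(1:3, each = 5)),
  mean  = c(rnorm(5, 260, 10), rnorm(5, 490, 15), rnorm(5, 525, 15))
)
fit <- aov(mean ~ Group, data = toy)
tk  <- TukeyHSD(fit, "Group")
tk$Group[, "p adj"]  # one adjusted p-value per group pair (2-1, 3-1, 3-2)
```

The rows of tk$Group carry the difference, the confidence limits, and the adjusted p-value for each pair, which is exactly what the printed output above shows.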

Now let’s see how each group differs once the baseline is taken into account.

# Fit the linear model with the mean as the response 
rats_fit2 <- lm(mean ~ baseline + Group, data = rats_l10s2)

# regression model output
round(rats_fit2$coefficients, digits = 2) %>% knitr::kable() %>% kable_styling(bootstrap_options = "striped", full_width = FALSE, position = "center")
x
(Intercept) 33.16
baseline 0.93
Group2 34.86
Group3 23.68

Here is a regression model of the mean rat weights with the baseline (the first weight measurement of each rat) as a covariate. The results show that, adjusted for baseline, group 2 lies about 35 g above and group 3 about 24 g above group 1. So relative to its starting weight, group 2 actually had the highest weight increase over the measurement period.

# Compute the analysis of variance table of rats_fit2 <- lm(mean ~ baseline + Group, data = rats_l10s2) with anova()
rats_anova2 <- anova(rats_fit2)
rats_anova2
## Analysis of Variance Table
## 
## Response: mean
##           Df Sum Sq Mean Sq   F value   Pr(>F)    
## baseline   1 253625  253625 1859.8201 1.57e-14 ***
## Group      2    879     439    3.2219  0.07586 .  
## Residuals 12   1636     136                       
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

The second ANOVA shows that the baseline has a highly significant influence on the later weights in the different diet groups, while the remaining group effect after adjusting for baseline is only borderline (p ≈ 0.076). Still, relative to baseline, diet 2 seems to lead to the highest weight increase.


4. Analyses of Chapter 9 of MABS using the BPRS data (bprs_l)

4.1. First plots of the data

Now follows the analysis of the BPRS data set with linear mixed models. The BPRS data includes measurements over a period of 8 weeks plus a starting value at week 0 (so 9 BPRS determinations in total).

str(bprs_l)
## 'data.frame':    360 obs. of  5 variables:
##  $ treatment: Factor w/ 2 levels "1","2": 1 1 1 1 1 1 1 1 1 1 ...
##  $ subject  : Factor w/ 20 levels "1","2","3","4",..: 1 2 3 4 5 6 7 8 9 10 ...
##  $ weeks    : Factor w/ 9 levels "week0","week1",..: 1 1 1 1 1 1 1 1 1 1 ...
##  $ bprs     : int  42 58 54 55 72 48 71 30 41 57 ...
##  $ week     : int  0 0 0 0 0 0 0 0 0 0 ...
# overview plot
pairs(bprs_l, col = bprs_l$subject)

#plotting the bprs_l data 1
ggplot(bprs_l, aes(x = week, y = bprs, shape = subject, group = treatment)) +
  geom_point(color = "darkgreen") + 
  scale_shape_manual(values = rep(1:10, times = 2)) +
  scale_x_continuous(name = "Weeks", breaks = seq(0,8,1)) + 
  scale_y_continuous(name = "BPRS value") + 
  theme(legend.position = "bottom") + 
  ggtitle("BPRS: data values overview") +
  theme(legend.box.background = element_rect(),legend.box.margin = margin(2, 2, 2, 2))

#plotting the bprs_l data 2
ggplot(bprs_l, aes(x = week, y = bprs, linetype = subject)) +
  geom_line(color = "darkgreen") +
  scale_linetype_manual(values = rep(1:6, times = 4)) +
  facet_grid(. ~ treatment, labeller = label_both) +
  scale_x_continuous(name = "Weeks", breaks = seq(0,8,1)) + 
  scale_y_continuous(name = "BPRS values observed", breaks = seq(10,100,5)) + 
  theme(legend.position = "bottom") +
  ggtitle("BPRS: data values overview by treatment") +
  theme(legend.box.background = element_rect(),legend.box.margin = margin(2, 2, 2, 2))

#plotting the bprs_l data 3
ggplot(bprs_l, aes(x = week, y = bprs)) +
  geom_line(aes(linetype = treatment), color = "darkgreen") +
  scale_linetype_manual(values = rep(1:2, times = 1)) +
  facet_grid(.~ subject) +
  scale_x_continuous(name = "Weeks", breaks = seq(0,8,4)) + 
  scale_y_continuous(name = "BPRS values observed", breaks = seq(10,100,5)) + 
  theme(legend.position = "right") +
  ggtitle("BPRS: data values overview by subject") +
  theme(legend.box.background = element_rect(),legend.box.margin = margin(2, 2, 2, 2))

The overview plots give us a glimpse into the data. We have 40 subjects in two treatment groups (so 20 in each group). After a first BPRS determination (week 0) the treatment started and 8 more BPRS determinations followed.


4.2. Linear model of bprs data

We start with a simple linear regression model with the BPRS value as the dependent variable and week + treatment as independent (explanatory) variables.

# create a regression model bprs_l_reg
bprs_l_reg <- lm(bprs ~ week + treatment, data = bprs_l)

# print out a summary of the model
summary(bprs_l_reg)
## 
## Call:
## lm(formula = bprs ~ week + treatment, data = bprs_l)
## 
## Residuals:
##     Min      1Q  Median      3Q     Max 
## -22.454  -8.965  -3.196   7.002  50.244 
## 
## Coefficients:
##             Estimate Std. Error t value Pr(>|t|)    
## (Intercept)  46.4539     1.3670  33.982   <2e-16 ***
## week         -2.2704     0.2524  -8.995   <2e-16 ***
## treatment2    0.5722     1.3034   0.439    0.661    
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 12.37 on 357 degrees of freedom
## Multiple R-squared:  0.1851, Adjusted R-squared:  0.1806 
## F-statistic: 40.55 on 2 and 357 DF,  p-value: < 2.2e-16

This linear regression shows that the measurement week has a significant effect on the BPRS values: with each week the BPRS value decreases by about 2.27 units. Treatment 2 might have a slightly better effect on the BPRS values of the subjects, but its influence is not significant. But are the treatments really helping the patients, and do the individual subjects (patients) influence the model? To account for that, we add the subjects as a random-effects term in a random intercept model.
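A quick way to see the same conclusion is to look at confidence intervals of the coefficients. A self-contained sketch with toy data of the same shape (hypothetical BPRS-like values, not the real bprs_l data):

```r
# 95% confidence intervals: an interval covering zero corresponds to a
# non-significant coefficient (toy values only)
toy <- data.frame(
  bprs      = c(50, 47, 44, 41, 52, 49, 45, 43),
  week      = rep(0:3, times = 2),
  treatment = factor(rep(1:2, each = 4))
)
fit <- lm(bprs ~ week + treatment, data = toy)
confint(fit)
```

On the real model, confint(bprs_l_reg) would show the week interval well below zero and the treatment2 interval straddling zero, matching the summary above.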


4.3. Random Intercept Model

Now we fit a random intercept model to the BPRS data. We keep the measurement period “week” and the “treatment” as explanatory variables and add the subjects (the 40 men undergoing the treatment and measurements) as the random-effects term. This model allows the linear regression fit of each subject to differ in its intercept from the other subjects.

# Create a random intercept model
bprs_l_rim <- lmer(bprs ~ week + treatment + (1 | subject), data = bprs_l, REML = FALSE)

# Print the summary of the model
summary(bprs_l_rim)
## Linear mixed model fit by maximum likelihood  ['lmerMod']
## Formula: bprs ~ week + treatment + (1 | subject)
##    Data: bprs_l
## 
##      AIC      BIC   logLik deviance df.resid 
##   2748.7   2768.1  -1369.4   2738.7      355 
## 
## Scaled residuals: 
##     Min      1Q  Median      3Q     Max 
## -3.0481 -0.6749 -0.1361  0.4813  3.4855 
## 
## Random effects:
##  Groups   Name        Variance Std.Dev.
##  subject  (Intercept)  47.41    6.885  
##  Residual             104.21   10.208  
## Number of obs: 360, groups:  subject, 20
## 
## Fixed effects:
##             Estimate Std. Error t value
## (Intercept)  46.4539     1.9090  24.334
## week         -2.2704     0.2084 -10.896
## treatment2    0.5722     1.0761   0.532
## 
## Correlation of Fixed Effects:
##            (Intr) week  
## week       -0.437       
## treatment2 -0.282  0.000

We see in the random intercept model summary that the standard error of week (the measurement period) is smaller than in the linear regression model (0.2084 compared to 0.2524), and the standard error of treatment 2 is smaller as well (1.0761 compared to 1.3034). This indicates that the subject effect was taken into account in this model. The t-value indicates whether the confidence interval of a coefficient could include 0: here the interval for week clearly excludes 0, while the interval for treatment 2 does not. In general, the mixed model (lmer) accounts for the correlations between observations on the same subject, so it is more appropriate than the simple linear regression model. lmer does not report p-values directly, but approximate p-values can be derived from the t-values.
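Since lmer reports only t-values, a quick way to get approximate two-sided p-values is a normal approximation of the t-statistics. A minimal sketch, using the t-values copied from the summary above:

```r
# Approximate two-sided p-values from the t-values reported by lmer().
# With 360 observations the t-distribution is close to standard normal,
# so a normal approximation is a common quick check (not exact inference).
t_values <- c(week = -10.896, treatment2 = 0.532)  # from the summary above
p_values <- 2 * pnorm(abs(t_values), lower.tail = FALSE)
round(p_values, 4)
# week is far below 0.05; treatment2 is clearly non-significant
```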


4.4. Random Intercept and Random Slope Model

Here we fit a random intercept and random slope model to the BPRS values. This allows the linear regression fit of each subject (patient) to differ in both intercept and slope, so it is possible to account for subject differences in the baseline BPRS values and also in the effect of the measurement period. In the next model, “week” and “subject” are therefore included in the random-effects term.

# create a random intercept and random slope model
bprs_l_rim1 <- lmer(bprs ~ week + treatment + (week | subject), data = bprs_l, REML = FALSE)

# with (week | subject) each subject gets his own intercept and his own slope
# over the weeks - so a separate regression line is fitted for each subject

# print a summary of the model
summary(bprs_l_rim1)
## Linear mixed model fit by maximum likelihood  ['lmerMod']
## Formula: bprs ~ week + treatment + (week | subject)
##    Data: bprs_l
## 
##      AIC      BIC   logLik deviance df.resid 
##   2745.4   2772.6  -1365.7   2731.4      353 
## 
## Scaled residuals: 
##     Min      1Q  Median      3Q     Max 
## -2.8919 -0.6194 -0.0691  0.5531  3.7977 
## 
## Random effects:
##  Groups   Name        Variance Std.Dev. Corr 
##  subject  (Intercept) 64.8222  8.0512        
##           week         0.9609  0.9803   -0.51
##  Residual             97.4304  9.8707        
## Number of obs: 360, groups:  subject, 20
## 
## Fixed effects:
##             Estimate Std. Error t value
## (Intercept)  46.4539     2.1052  22.066
## week         -2.2704     0.2977  -7.626
## treatment2    0.5722     1.0405   0.550
## 
## Correlation of Fixed Effects:
##            (Intr) week  
## week       -0.582       
## treatment2 -0.247  0.000

The model results do not differ much from the previous model. To check for a difference between the models we perform an ANOVA (likelihood ratio test).

# perform an ANOVA test on the two models
anova(bprs_l_rim1, bprs_l_rim)
## Data: bprs_l
## Models:
## bprs_l_rim: bprs ~ week + treatment + (1 | subject)
## bprs_l_rim1: bprs ~ week + treatment + (week | subject)
##             Df    AIC    BIC  logLik deviance  Chisq Chi Df Pr(>Chisq)  
## bprs_l_rim   5 2748.7 2768.1 -1369.4   2738.7                           
## bprs_l_rim1  7 2745.4 2772.6 -1365.7   2731.4 7.2721      2    0.02636 *
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

The ANOVA shows that the “Random Intercept and Random Slope Model” (allowing both intercept and slope to vary by subject) gives a significantly better fit for the data (p-value of about 0.026).
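The chi-square statistic in the ANOVA table is simply the difference of the two deviances, with degrees of freedom equal to the difference in parameter counts (7 - 5 = 2). A sketch reproducing the p-value from the (rounded) numbers in the table above:

```r
# Likelihood ratio test by hand, using the deviances from the ANOVA table:
# chi-square = deviance(simpler model) - deviance(richer model)
chisq <- 2738.7 - 2731.4
p_value <- pchisq(chisq, df = 2, lower.tail = FALSE)
round(c(chisq = chisq, p = p_value), 4)
# matches the ANOVA output up to rounding of the deviances
```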


4.5. Random Intercept and Random Slope Model with interaction

Now we fit a “Random Intercept and Random Slope Model” in which we include an interaction between treatment and time (weeks) to see whether the treatment effect on the BPRS values changes over the treatment period.

# create a random intercept and random slope model
bprs_l_rim2 <- lmer(bprs ~ week * treatment + (week | subject), data = bprs_l, REML = FALSE)

# print a summary of the model
summary(bprs_l_rim2)
## Linear mixed model fit by maximum likelihood  ['lmerMod']
## Formula: bprs ~ week * treatment + (week | subject)
##    Data: bprs_l
## 
##      AIC      BIC   logLik deviance df.resid 
##   2744.3   2775.4  -1364.1   2728.3      352 
## 
## Scaled residuals: 
##     Min      1Q  Median      3Q     Max 
## -3.0512 -0.6271 -0.0768  0.5288  3.9260 
## 
## Random effects:
##  Groups   Name        Variance Std.Dev. Corr 
##  subject  (Intercept) 64.9964  8.0620        
##           week         0.9687  0.9842   -0.51
##  Residual             96.4707  9.8220        
## Number of obs: 360, groups:  subject, 20
## 
## Fixed effects:
##                 Estimate Std. Error t value
## (Intercept)      47.8856     2.2521  21.262
## week             -2.6283     0.3589  -7.323
## treatment2       -2.2911     1.9090  -1.200
## week:treatment2   0.7158     0.4010   1.785
## 
## Correlation of Fixed Effects:
##             (Intr) week   trtmn2
## week        -0.650              
## treatment2  -0.424  0.469       
## wek:trtmnt2  0.356 -0.559 -0.840
# perform an ANOVA test on the two models
anova(bprs_l_rim2, bprs_l_rim1)
## Data: bprs_l
## Models:
## bprs_l_rim1: bprs ~ week + treatment + (week | subject)
## bprs_l_rim2: bprs ~ week * treatment + (week | subject)
##             Df    AIC    BIC  logLik deviance  Chisq Chi Df Pr(>Chisq)  
## bprs_l_rim1  7 2745.4 2772.6 -1365.7   2731.4                           
## bprs_l_rim2  8 2744.3 2775.4 -1364.1   2728.3 3.1712      1    0.07495 .
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

The ANOVA suggests that the “Random Intercept and Random Slope Model” with the week * treatment interaction may fit the data somewhat better, but the improvement is not significant at the 5% level (p-value of 0.075 < 0.1). So there is at most weak evidence for an interaction between the treatment and the treatment time.
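The interaction coefficient can be read as a difference in slopes between the two treatment groups. A sketch using the fixed-effect estimates from the summary above:

```r
# Treatment-specific weekly slopes implied by the interaction model:
# treatment 1 uses the "week" coefficient alone;
# treatment 2 adds the "week:treatment2" interaction term.
slope_t1 <- -2.6283
slope_t2 <- -2.6283 + 0.7158
round(c(treatment1 = slope_t1, treatment2 = slope_t2), 2)
# treatment 1 declines by about 2.63 BPRS points per week,
# treatment 2 by about 1.91 points per week
```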

Now we look at the BPRS data again in a graph; then we add the fitted BPRS values produced by the “Random Intercept and Random Slope Model” with the week * treatment interaction to the data table and draw the plot of the fitted BPRS values.

# draw the plot of bprs_l
ggplot(bprs_l, aes(x = week, y = bprs, linetype = subject)) +
  geom_line(color = "darkgreen") +
  scale_linetype_manual(values = rep(1:6, times = 4)) +
  facet_grid(. ~ treatment, labeller = label_both) +
  scale_x_continuous(name = "Weeks", breaks = seq(0,8,1)) + 
  scale_y_continuous(name = "BPRS values observed", breaks = seq(10,100,5)) + 
  theme(legend.position = "bottom") + 
  ggtitle("BPRS: original values by treatment") +
  theme(legend.box.background = element_rect(),legend.box.margin = margin(2, 2, 2, 2))

Now we calculate the fitted BPRS values according to the latest model and add them to the data table. These fitted values are then plotted. The plot nicely shows the BPRS value development according to the model.

# Create a vector of the fitted values
bprs_l_fit <- fitted(bprs_l_rim2)

# Create a new column "fitted_bprs" to bprs_l
bprs_l$fitted_bprs <- round(bprs_l_fit, digits = 2)

# Check the BPRS data table again
bprs_l %>% knitr::kable() %>% kable_styling(bootstrap_options = "striped", full_width = FALSE, position = "center") %>% scroll_box(height = "300px")
treatment subject weeks bprs week fitted_bprs
1 1 week0 42 0 49.24
1 2 week0 58 0 46.97
1 3 week0 54 0 47.66
1 4 week0 55 0 49.85
1 5 week0 72 0 66.39
1 6 week0 48 0 42.59
1 7 week0 71 0 54.58
1 8 week0 30 0 47.93
1 9 week0 41 0 42.08
1 10 week0 57 0 53.35
1 11 week0 30 0 59.97
1 12 week0 55 0 48.42
1 13 week0 36 0 50.36
1 14 week0 38 0 40.93
1 15 week0 66 0 55.07
1 16 week0 41 0 44.43
1 17 week0 45 0 43.93
1 18 week0 39 0 36.82
1 19 week0 24 0 38.14
1 20 week0 38 0 39.01
2 1 week0 52 0 46.95
2 2 week0 30 0 44.68
2 3 week0 65 0 45.36
2 4 week0 37 0 47.56
2 5 week0 59 0 64.10
2 6 week0 30 0 40.30
2 7 week0 69 0 52.29
2 8 week0 62 0 45.64
2 9 week0 38 0 39.79
2 10 week0 65 0 51.05
2 11 week0 78 0 57.68
2 12 week0 38 0 46.13
2 13 week0 63 0 48.07
2 14 week0 40 0 38.64
2 15 week0 40 0 52.78
2 16 week0 54 0 42.14
2 17 week0 33 0 41.63
2 18 week0 28 0 34.53
2 19 week0 52 0 35.85
2 20 week0 47 0 36.71
1 1 week1 36 1 47.97
1 2 week1 68 1 43.72
1 3 week1 55 1 44.47
1 4 week1 77 1 47.41
1 5 week1 75 1 62.11
1 6 week1 43 1 40.04
1 7 week1 61 1 51.11
1 8 week1 36 1 46.36
1 9 week1 43 1 39.11
1 10 week1 51 1 50.79
1 11 week1 34 1 58.01
1 12 week1 52 1 45.55
1 13 week1 32 1 47.26
1 14 week1 35 1 38.73
1 15 week1 68 1 51.69
1 16 week1 35 1 41.50
1 17 week1 38 1 42.30
1 18 week1 35 1 35.03
1 19 week1 28 1 35.66
1 20 week1 34 1 36.35
2 1 week1 73 1 46.39
2 2 week1 23 1 42.14
2 3 week1 31 1 42.89
2 4 week1 31 1 45.83
2 5 week1 67 1 60.53
2 6 week1 33 1 38.47
2 7 week1 52 1 49.53
2 8 week1 54 1 44.79
2 9 week1 40 1 37.53
2 10 week1 44 1 49.21
2 11 week1 95 1 56.44
2 12 week1 41 1 43.97
2 13 week1 65 1 45.68
2 14 week1 37 1 37.16
2 15 week1 36 1 50.11
2 16 week1 45 1 39.92
2 17 week1 41 1 40.72
2 18 week1 30 1 33.45
2 19 week1 43 1 34.09
2 20 week1 36 1 34.78
1 1 week2 36 2 46.69
1 2 week2 61 2 40.46
1 3 week2 41 2 41.28
1 4 week2 49 2 44.96
1 5 week2 72 2 57.82
1 6 week2 41 2 37.49
1 7 week2 47 2 47.64
1 8 week2 38 2 44.79
1 9 week2 39 2 36.14
1 10 week2 51 2 48.23
1 11 week2 34 2 56.06
1 12 week2 49 2 42.68
1 13 week2 36 2 44.15
1 14 week2 36 2 36.53
1 15 week2 65 2 48.30
1 16 week2 45 2 38.56
1 17 week2 46 2 40.67
1 18 week2 27 2 33.24
1 19 week2 31 2 33.19
1 20 week2 27 2 33.70
2 1 week2 42 2 45.83
2 2 week2 32 2 39.60
2 3 week2 33 2 40.42
2 4 week2 27 2 44.10
2 5 week2 58 2 56.96
2 6 week2 37 2 36.63
2 7 week2 41 2 46.78
2 8 week2 49 2 43.93
2 9 week2 38 2 35.28
2 10 week2 31 2 47.37
2 11 week2 75 2 55.20
2 12 week2 36 2 41.82
2 13 week2 60 2 43.29
2 14 week2 31 2 35.67
2 15 week2 55 2 47.44
2 16 week2 35 2 37.70
2 17 week2 30 2 39.81
2 18 week2 29 2 32.38
2 19 week2 26 2 32.33
2 20 week2 32 2 32.84
1 1 week3 43 3 45.42
1 2 week3 55 3 37.20
1 3 week3 38 3 38.08
1 4 week3 54 3 42.52
1 5 week3 65 3 53.54
1 6 week3 38 3 34.94
1 7 week3 30 3 44.17
1 8 week3 38 3 43.22
1 9 week3 35 3 33.17
1 10 week3 55 3 45.68
1 11 week3 41 3 54.11
1 12 week3 54 3 39.81
1 13 week3 31 3 41.04
1 14 week3 34 3 34.33
1 15 week3 49 3 44.92
1 16 week3 42 3 35.63
1 17 week3 38 3 39.04
1 18 week3 25 3 31.46
1 19 week3 28 3 30.71
1 20 week3 25 3 31.04
2 1 week3 41 3 45.27
2 2 week3 24 3 37.06
2 3 week3 28 3 37.94
2 4 week3 31 3 42.37
2 5 week3 61 3 53.39
2 6 week3 33 3 34.79
2 7 week3 33 3 44.02
2 8 week3 39 3 43.08
2 9 week3 27 3 33.02
2 10 week3 34 3 45.53
2 11 week3 76 3 53.96
2 12 week3 27 3 39.66
2 13 week3 53 3 40.90
2 14 week3 38 3 34.18
2 15 week3 55 3 44.78
2 16 week3 27 3 35.48
2 17 week3 32 3 38.89
2 18 week3 33 3 31.31
2 19 week3 27 3 30.57
2 20 week3 29 3 30.90
1 1 week4 41 4 44.14
1 2 week4 43 4 33.95
1 3 week4 43 4 34.89
1 4 week4 56 4 40.07
1 5 week4 50 4 49.25
1 6 week4 36 4 32.38
1 7 week4 27 4 40.69
1 8 week4 31 4 41.65
1 9 week4 28 4 30.19
1 10 week4 53 4 43.12
1 11 week4 36 4 52.15
1 12 week4 48 4 36.94
1 13 week4 25 4 37.94
1 14 week4 25 4 32.13
1 15 week4 36 4 41.54
1 16 week4 31 4 32.69
1 17 week4 40 4 37.41
1 18 week4 29 4 29.67
1 19 week4 29 4 28.23
1 20 week4 25 4 28.39
2 1 week4 39 4 44.72
2 2 week4 20 4 34.52
2 3 week4 22 4 35.47
2 4 week4 31 4 40.65
2 5 week4 49 4 49.83
2 6 week4 28 4 32.95
2 7 week4 34 4 41.27
2 8 week4 55 4 42.23
2 9 week4 31 4 30.77
2 10 week4 39 4 43.70
2 11 week4 66 4 52.72
2 12 week4 29 4 37.51
2 13 week4 52 4 38.51
2 14 week4 35 4 32.70
2 15 week4 42 4 42.11
2 16 week4 25 4 33.26
2 17 week4 46 4 37.98
2 18 week4 30 4 30.24
2 19 week4 24 4 28.81
2 20 week4 25 4 28.96
1 1 week5 40 5 42.87
1 2 week5 34 5 30.69
1 3 week5 28 5 31.70
1 4 week5 50 5 37.63
1 5 week5 39 5 44.97
1 6 week5 29 5 29.83
1 7 week5 40 5 37.22
1 8 week5 26 5 40.08
1 9 week5 22 5 27.22
1 10 week5 43 5 40.57
1 11 week5 36 5 50.20
1 12 week5 43 5 34.06
1 13 week5 25 5 34.83
1 14 week5 27 5 29.92
1 15 week5 32 5 38.16
1 16 week5 31 5 29.76
1 17 week5 33 5 35.78
1 18 week5 28 5 27.88
1 19 week5 21 5 25.76
1 20 week5 27 5 25.73
2 1 week5 38 5 44.16
2 2 week5 20 5 31.98
2 3 week5 25 5 32.99
2 4 week5 26 5 38.92
2 5 week5 38 5 46.26
2 6 week5 26 5 31.12
2 7 week5 37 5 38.51
2 8 week5 51 5 41.37
2 9 week5 24 5 28.51
2 10 week5 34 5 41.86
2 11 week5 64 5 51.49
2 12 week5 27 5 35.35
2 13 week5 32 5 36.12
2 14 week5 30 5 31.21
2 15 week5 30 5 39.45
2 16 week5 22 5 31.04
2 17 week5 43 5 37.07
2 18 week5 26 5 29.17
2 19 week5 32 5 27.05
2 20 week5 23 5 27.02
1 1 week6 38 6 41.60
1 2 week6 28 6 27.43
1 3 week6 29 6 28.51
1 4 week6 47 6 35.18
1 5 week6 32 6 40.69
1 6 week6 33 6 27.28
1 7 week6 30 6 33.75
1 8 week6 26 6 38.51
1 9 week6 20 6 24.25
1 10 week6 43 6 38.01
1 11 week6 38 6 48.24
1 12 week6 37 6 31.19
1 13 week6 21 6 31.73
1 14 week6 25 6 27.72
1 15 week6 27 6 34.78
1 16 week6 29 6 26.82
1 17 week6 27 6 34.15
1 18 week6 21 6 26.09
1 19 week6 22 6 23.28
1 20 week6 21 6 23.08
2 1 week6 43 6 43.60
2 2 week6 19 6 29.44
2 3 week6 24 6 30.52
2 4 week6 24 6 37.19
2 5 week6 37 6 42.69
2 6 week6 27 6 29.28
2 7 week6 37 6 35.76
2 8 week6 55 6 40.52
2 9 week6 22 6 26.25
2 10 week6 41 6 40.02
2 11 week6 64 6 50.25
2 12 week6 21 6 33.20
2 13 week6 37 6 33.73
2 14 week6 33 6 29.73
2 15 week6 26 6 36.78
2 16 week6 22 6 28.82
2 17 week6 43 6 36.15
2 18 week6 36 6 28.10
2 19 week6 21 6 25.29
2 20 week6 23 6 25.08
1 1 week7 47 7 40.32
1 2 week7 28 7 24.18
1 3 week7 25 7 25.32
1 4 week7 42 7 32.74
1 5 week7 38 7 36.40
1 6 week7 27 7 24.72
1 7 week7 31 7 30.28
1 8 week7 25 7 36.94
1 9 week7 23 7 21.28
1 10 week7 39 7 35.46
1 11 week7 36 7 46.29
1 12 week7 36 7 28.32
1 13 week7 19 7 28.62
1 14 week7 26 7 25.52
1 15 week7 30 7 31.40
1 16 week7 26 7 23.88
1 17 week7 31 7 32.52
1 18 week7 25 7 24.31
1 19 week7 23 7 20.81
1 20 week7 19 7 20.43
2 1 week7 62 7 43.04
2 2 week7 18 7 26.90
2 3 week7 31 7 28.04
2 4 week7 26 7 35.46
2 5 week7 36 7 39.12
2 6 week7 23 7 27.44
2 7 week7 38 7 33.00
2 8 week7 59 7 39.66
2 9 week7 21 7 24.00
2 10 week7 42 7 38.18
2 11 week7 60 7 49.01
2 12 week7 22 7 31.04
2 13 week7 52 7 31.34
2 14 week7 30 7 28.24
2 15 week7 30 7 34.12
2 16 week7 22 7 26.60
2 17 week7 43 7 35.24
2 18 week7 33 7 27.03
2 19 week7 21 7 23.53
2 20 week7 23 7 23.15
1 1 week8 51 8 39.05
1 2 week8 28 8 20.92
1 3 week8 24 8 22.13
1 4 week8 46 8 30.29
1 5 week8 32 8 32.12
1 6 week8 25 8 22.17
1 7 week8 31 8 26.81
1 8 week8 24 8 35.37
1 9 week8 21 8 18.31
1 10 week8 32 8 32.90
1 11 week8 36 8 44.33
1 12 week8 31 8 25.45
1 13 week8 22 8 25.52
1 14 week8 26 8 23.32
1 15 week8 37 8 28.02
1 16 week8 30 8 20.95
1 17 week8 27 8 30.89
1 18 week8 20 8 22.52
1 19 week8 22 8 18.33
1 20 week8 21 8 17.77
2 1 week8 50 8 42.48
2 2 week8 20 8 24.35
2 3 week8 32 8 25.57
2 4 week8 23 8 33.73
2 5 week8 35 8 35.55
2 6 week8 21 8 25.61
2 7 week8 35 8 30.25
2 8 week8 66 8 38.81
2 9 week8 21 8 21.74
2 10 week8 39 8 36.34
2 11 week8 75 8 47.77
2 12 week8 23 8 28.89
2 13 week8 28 8 28.95
2 14 week8 27 8 26.75
2 15 week8 37 8 31.45
2 16 week8 22 8 24.38
2 17 week8 43 8 34.33
2 18 week8 30 8 25.96
2 19 week8 21 8 21.77
2 20 week8 23 8 21.21
# Draw the plot of fitted bprs values
ggplot(bprs_l, aes(x = week, y = fitted_bprs, linetype = subject)) +
  geom_line(color = "darkred") +
  scale_linetype_manual(values = rep(1:6, times = 4)) +
  facet_grid(. ~ treatment, labeller = label_both) +
  scale_x_continuous(name = "Weeks", breaks = seq(0,8,1)) + 
  scale_y_continuous(name = "BPRS values modeled", breaks = seq(10,100,5)) + 
  theme(legend.position = "bottom") + 
  ggtitle("BPRS: fitted values by treatment") +
  theme(legend.box.background = element_rect(),legend.box.margin = margin(2, 2, 2, 2))
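To quantify how closely the fitted values track the observations, one could compute the root-mean-square error between the two columns. A small illustration using the first five subjects of treatment 1 at week 0, with the values copied from the table above (on the full data one would simply compute `sqrt(mean((bprs_l$bprs - bprs_l$fitted_bprs)^2))`):

```r
# Observed vs. fitted BPRS for five subjects at week 0
# (values taken from the data table above, for illustration only)
observed <- c(42, 58, 54, 55, 72)
fitted_v <- c(49.24, 46.97, 47.66, 49.85, 66.39)
rmse <- sqrt(mean((observed - fitted_v)^2))
round(rmse, 2)
# the model's predictions for these rows are off by roughly 7 BPRS points
```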